11+ Geospatial analysis Jobs in India
SQL, Python, NumPy, Pandas; knowledge of Hive and data warehousing concepts will be a plus.
JD
- Strong analytical skills with the ability to collect, organise, analyse and interpret trends or patterns in complex data sets and provide reports & visualisations.
- Work with management to prioritise business KPIs and information needs.
- Locate and define new process improvement opportunities.
- Technical expertise with data models, database design and development, data mining and segmentation techniques
- Proven success in a collaborative, team-oriented environment
- Working experience with geospatial data will be a plus.
- Design and maintain efficient database solutions using RDBMS.
- Write complex SQL queries for data extraction and manipulation.
- Implement and optimize AWS services for scalable application deployment.
- Develop server-side logic using PHP and integrate front-end elements with JavaScript.
- Collaborate with teams to design, develop, and deploy web applications.
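The SQL bullet above lends itself to a small illustration. The sketch below, using Python's built-in sqlite3 module, shows the kind of extraction query involved (a GROUP BY aggregation with a HAVING filter); the orders table, its columns, and the threshold are made-up assumptions for illustration, not part of this job description.

```python
import sqlite3

# Hypothetical schema and data -- names are assumptions, not from the listing.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL, region TEXT);
    INSERT INTO orders VALUES
        (1, 'A', 120.0, 'north'), (2, 'B', 80.0, 'south'),
        (3, 'A', 200.0, 'north'), (4, 'C', 50.0, 'south');
""")

# A "complex" extraction query: aggregate per region, then keep only
# regions whose total exceeds a threshold, largest first.
rows = conn.execute("""
    SELECT region, COUNT(*) AS n_orders, SUM(amount) AS total
    FROM orders
    GROUP BY region
    HAVING SUM(amount) > 100
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('north', 2, 320.0), ('south', 2, 130.0)]
```

The same HAVING-after-GROUP BY shape carries over to most RDBMS dialects, which is the kind of query the role describes.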
ABOUT EPISOURCE:
Episource has devoted more than a decade to building risk-adjustment solutions that measure healthcare outcomes. As one of the leading companies in healthcare, we have helped numerous clients optimize their medical records, data, and analytics to enable better documentation of care for patients with chronic diseases.
The backbone of our consistent success has been our obsession with data and technology. At Episource, all of our strategic initiatives start with the question: how can data be “deployed”? Our analytics platforms and data lakes ingest huge quantities of data daily to help our clients deliver services. We have also built our own machine learning and NLP platform to add productivity and efficiency to our workflow. Combined, these form a foundation of tools and practices used by quantitative staff across the company.
What’s our poison you ask? We work with most of the popular frameworks and technologies like Spark, Airflow, Ansible, Terraform, Docker, ELK. For machine learning and NLP, we are big fans of keras, spacy, scikit-learn, pandas and numpy. AWS and serverless platforms help us stitch these together to stay ahead of the curve.
ABOUT THE ROLE:
We’re looking to hire someone to help scale Machine Learning and NLP efforts at Episource. You’ll work with the team that develops the models powering Episource’s product focused on NLP-driven medical coding. Some of the problems include improving our ICD code recommendations, clinical named entity recognition, improving patient health, clinical suspecting, and information extraction from clinical notes.
This is a role for highly technical data engineers who combine outstanding oral and written communication skills with the ability to code up prototypes and productionize them using a large range of tools, algorithms, and languages. Most importantly, they need the ability to autonomously plan and organize their work assignments based on high-level team goals.
You will be responsible for setting an agenda to develop and ship data-driven architectures that positively impact the business, working with partners across the company including operations and engineering. You will use research results to shape strategy for the company and help build a foundation of tools and practices used by quantitative staff across the company.
During a typical day with our team, expect to work on one or more projects around the following:
1. Create and maintain optimal data pipeline architectures for ML
2. Develop a strong API ecosystem for ML pipelines
3. Build CI/CD pipelines for ML deployments using GitHub Actions, Travis, Terraform, and Ansible
4. Design and develop distributed, high-volume, high-velocity multi-threaded event processing systems
5. Apply software engineering best practices across the development lifecycle: coding standards, code reviews, source management, build processes, testing, and operations
6. Deploy data pipelines in production using Infrastructure-as-Code platforms
7. Design scalable implementations of the models developed by our Data Science teams
8. Work on big data and distributed ML with PySpark on AWS EMR, and more!
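The multi-threaded event processing point above can be sketched in miniature with Python's standard library; the event payloads, worker count, and the doubling "processing" step are illustrative assumptions only, not anything specified by the role.

```python
import queue
import threading

events = queue.Queue()   # shared work queue of incoming events
results = []
lock = threading.Lock()

def worker():
    while True:
        event = events.get()
        if event is None:              # sentinel: shut this worker down
            events.task_done()
            break
        with lock:                     # protect the shared results list
            results.append(event * 2)  # stand-in for real event processing
        events.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()

for e in range(10):                    # enqueue ten toy events
    events.put(e)
for _ in threads:                      # one sentinel per worker
    events.put(None)

events.join()                          # wait until every event is processed
for t in threads:
    t.join()

print(sorted(results))  # [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

A production system would swap the in-process queue for something like Kafka and make the workers separate processes or services, but the enqueue/consume/ack structure is the same.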
BASIC REQUIREMENTS
- Bachelor’s degree or greater in Computer Science, IT or related fields
- Minimum of 5 years of experience in cloud, DevOps, MLOps & data projects
- Strong experience with bash scripting, Unix environments and building scalable/distributed systems
- Experience with automation/configuration management using Ansible, Terraform, or equivalent
- Very strong experience with AWS and Python
- Experience building CI/CD systems
- Experience with containerization technologies like Docker, Kubernetes, ECS, EKS or equivalent
- Ability to build and manage application and performance monitoring processes
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- Thorough understanding of React.js and its core principles
- Experience with popular React.js workflows (such as Flux or Redux)
- Familiarity with newer specifications of ECMAScript
- Experience with data structure libraries (e.g., Immutable.js)
- Knowledge of isomorphic React is a plus
- Familiarity with RESTful APIs
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens (JWT)
- Familiarity with modern front-end build pipelines and tools
- Ability to understand business requirements and translate them into technical requirements
- A knack for benchmarking and optimization
- Familiarity with code versioning tools (such as Git, SVN, and Mercurial)
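To make the JWT bullet above concrete, here is a hand-rolled HS256 sketch using only Python's standard library. It shows what a JSON Web Token is structurally (header.payload.signature, each part base64url-encoded); in production you would rely on a maintained library such as PyJWT, and the secret and claims here are invented for illustration.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses base64url without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(claims: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, payload, sig = token.split(".")
    signing_input = f"{header}.{payload}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)  # constant-time comparison

token = sign_jwt({"sub": "user-123"}, b"dev-secret")
print(verify_jwt(token, b"dev-secret"))    # True
print(verify_jwt(token, b"wrong-secret"))  # False
```

A front end typically never verifies the token itself; it stores it and sends it in an `Authorization: Bearer` header, while the server does the check above.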

Streaming Data Integration Startup
- Degree in Computer Science (BS/MS), related technical field or equivalent practical experience
- 8+ years of industry experience in product development
- Experience in coaching and mentoring team members
- Must own a specific component of the product: gather requirements by working with product management, and work closely with development managers to define external product interfaces
- Must have excellent written and verbal communication skills, and must articulate design approaches and design decisions clearly, within the team and across teams
- Must work with QA engineers to devise proper test constructs: unit tests and integration tests
- Good knowledge of Facebook (App install & Lead Generation campaigns), Google Ads (Display, SEM & YouTube) and Email campaigns.
- Should be data-driven and able to ideate copy and creatives and conceptualize campaigns.
- Experience handling sizeable digital budgets would be an added advantage.
- Proficiency in keyword research and keyword competition analysis
- Proficiency in creating reports using Google Sheets/Excel.
- The Digital Specialist will be tasked with analyzing user experience data, initiating digital projects, reporting on planned and current strategies, leading effective digital marketing strategies, and ensuring that projects are executed within budget.
- You will play a pivotal role in growing our business, customer base, and improving user experience.
- Brainstorm new and creative growth strategies
- Research the latest digital tools and interactive trends for effective digital marketing.
Ideal Candidate
- 2+ years of hands-on experience in managing digital marketing campaigns.
- Knowledge of analytics tools such as Google Analytics
- Proficient in Google Sheets/Excel
- Familiarity with social media trends
- Excellent communication skills
- Creativity and commercial awareness
- Proven Experience of handling a team
We are hiring a Telesales Representative for SSR Techvision Pvt. Ltd.
Position: Telecaller Executive
Company: SSR Techvision Private Limited, Sector 64, Noida
Experience required: Minimum 6 months to 1 year
Shift: Night shift
Salary: 20-35k
Timing: 8 PM to 5:30 AM
Office work: 5.5 days working
Location: Noida
FinGrad is a financial education platform that offers curated webinars and courses by Market Experts and top Instructors to empower financial literacy in India.
Responsibilities:
• Performing research on the finance sector.
• Creating content on trading and investing niche.
• Writing scripts on finance-related topics.
• Finding emerging trends in the financial system.
• Performing in a reel or video while being creative.
• Creating audio podcasts.
Requirements/Skills:
• Graduate.
• Fresher (with stock market knowledge).
• Excellent English speaking skills.
• Familiarity with social media channels.
Good knowledge of Linux administration along with application hosting on the server; excellent knowledge of Linux commands
Experience with OEL and other Linux versions; manage VAPT requirements, server support, and backup strategy
Knowledge of hosting OBIEE, Informatica, and other applications such as Essbase will be an added advantage
Good technical knowledge and a readiness to keep learning new technologies
Configuration of SSL and port-related checks
Technical documentation and debugging skills
Coordination with other technical teams
Qualifications
Master’s or Bachelor’s degree in Engineering/Computer Science/Information Technology
Additional information
Excellent verbal and written communication skills
Senior Big Data Engineer
Note: Notice period: 45 days
Banyan Data Services (BDS) is a US-based data-focused Company that specializes in comprehensive data solutions and services, headquartered in San Jose, California, USA.
We are looking for a Senior Hadoop Big Data Engineer with expertise in solving complex data problems across a big data platform. You will be part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure.
It's a once-in-a-lifetime opportunity to join our rocket ship startup run by a world-class executive team. We are looking for candidates that aspire to be a part of the cutting-edge solutions and services we offer that address next-gen data evolution challenges.
Key Qualifications
· 5+ years of experience working with Java and Spring technologies
· At least 3 years of programming experience working with Spark on big data, including experience with data profiling and building transformations
· Knowledge of microservices architecture is a plus
· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra
· Experience with Kafka or any streaming tools
· Knowledge of Scala would be preferable
· Experience with agile application development
· Exposure to cloud technologies, including containers and Kubernetes
· Demonstrated experience in performing DevOps for platforms
· Strong skills in data structures and algorithms, with an efficient approach to code complexity
· Exposure to Graph databases
· Passion for learning new technologies and the ability to do so quickly
· A Bachelor's degree in a computer-related field or equivalent professional experience is required
Key Responsibilities
· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture
· Design and develop big-data-focused microservices
· Work on big data infrastructure, distributed systems, data modeling, and query processing
· Build software with cutting-edge technologies on the cloud
· Willingness to learn new technologies and take on research-oriented projects
· Proven interpersonal skills, contributing to team effort by accomplishing related results as needed
Role Description:
As a Lead Backend Engineer, you will be responsible for designing and deploying scalable, highly available, and fault-tolerant systems for Radiusagent. Experience as a Tech Lead who manages tasks with back-end engineers while coding themselves will be a bonus for the ideal candidate.
Responsibilities will include-
- You will contribute to all aspects of an agile software development lifecycle including design, architecture, development, documentation, testing, and operations.
- You will build cutting-edge scalable systems by writing simple and efficient code.
- You will push your design and architecture limits for new product development.
- You will ensure compliance with the build/release and configuration management process.
Skills needed:
- be able to design and build modules from the ground up
- proficient in any one programming language out of php/golang/node/python/javascript/java/ruby and ability to pick up others
- experience with mysql/postgres/mongodb/couchdb
- experience with redis/memcached
- experience with rabbitmq / kafka
- experience in having written cron jobs, troubleshooting downtime
- ability to work with ambiguous requirements
- good understanding of OS concepts
Nice to haves:
- experience with managing VMs on aws/gcp/digitalocean
- experience with docker/kubernetes
- experience with elasticsearch or lucene
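The redis/memcached bullet above is essentially about TTL-based caching. As a rough sketch of that pattern (not a substitute for either system), here is a toy in-process cache with lazy expiry; the key names, TTLs, and injectable clock are invented for illustration.

```python
import time

class TTLCache:
    """Toy in-process cache with per-key time-to-live, redis/memcached style."""

    def __init__(self, clock=time.monotonic):
        self._store = {}    # key -> (value, expires_at)
        self._clock = clock # injectable for testing

    def set(self, key, value, ttl_seconds):
        self._store[key] = (value, self._clock() + ttl_seconds)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if self._clock() >= expires_at:  # lazy expiry on read
            del self._store[key]
            return default
        return value

cache = TTLCache()
cache.set("session:42", {"user": "alice"}, ttl_seconds=60)
print(cache.get("session:42"))  # {'user': 'alice'}
print(cache.get("missing"))     # None
```

Real Redis adds eviction policies, persistence, and network access on top of this, but the set-with-TTL / get-or-miss interface is the core of how such a cache is used.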