Geospatial analysis jobs

11+ Geospatial analysis Jobs in India

Apply to 11+ Geospatial analysis Jobs on CutShort.io. Find your next job, effortlessly. Browse Geospatial analysis Jobs and apply today!

Yulu Bikes

Posted by Keerthana k
Bengaluru (Bangalore)
1 - 2 yrs
₹7L - ₹12L / yr
Data Science
Data Analytics
SQL
Python
Data Warehousing
Skill Set
SQL, Python, NumPy, Pandas. Knowledge of Hive and data warehousing concepts will be a plus.

JD 

- Strong analytical skills with the ability to collect, organise, analyse and interpret trends or patterns in complex data sets and provide reports & visualisations.

- Work with management to prioritise business KPIs and information needs; locate and define new process improvement opportunities.

- Technical expertise with data models, database design and development, data mining and segmentation techniques

- Proven success in a collaborative, team-oriented environment

- Working experience with geospatial data will be a plus.
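The listed stack (SQL, Python, NumPy/Pandas) together with the geospatial bonus can be illustrated with a small sketch. The `rides` table, its columns, and the coordinates below are hypothetical, and the haversine helper stands in for a real geospatial library such as GeoPandas:

```python
import sqlite3
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two (lat, lon) points, in kilometres
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE rides (id INTEGER, start_lat REAL, start_lon REAL, end_lat REAL, end_lon REAL)"
)
conn.executemany("INSERT INTO rides VALUES (?, ?, ?, ?, ?)", [
    (1, 12.9716, 77.5946, 12.9352, 77.6245),  # two points within Bengaluru
    (2, 12.9698, 77.7500, 12.9784, 77.6408),
])

# SQL fetches the raw coordinates; Python derives the geospatial metric
ride_km = {
    rid: round(haversine_km(slat, slon, elat, elon), 2)
    for rid, slat, slon, elat, elon in conn.execute("SELECT * FROM rides")
}
print(ride_km)
```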
Innovative Fintech Startup

Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore)
5 - 12 yrs
₹28L - ₹55L / yr
Data Science
Data Scientist
Machine Learning (ML)
Python
Statistical Modeling
- Lead a team of data scientists from top-tier schools, and collaborate with Founders and business heads to solve complex business problems
- Develop statistical and machine-learning-based models/pipelines/methods to improve business processes and engagements
- Conduct sophisticated data mining analyses of large volumes of data and build data science models as required, as part of credit and risk underwriting solutions, customer engagement and retention, new business initiatives, and business process improvements
- Translate data mining results into clear, business-focused deliverables for decision-makers
- Work with Application Developers on integrating machine learning algorithms and data mining models into operational systems, leading to automation, productivity gains, and time savings
- Provide the technical direction required to resolve complex issues and ensure on-time delivery of solutions that meet the business team’s expectations; may need to develop new methods to apply to novel situations
- Knowledge of how to leverage statistical models in algorithms is a must
- Experience in multivariate analysis: identifying how several parameters can affect customer retention/behaviour, and identifying actions at different points of the customer lifecycle

- Extensive experience coding in Python, including mentoring teams to learn the same
- Great understanding of the data science landscape and which tools to leverage for different problems
- A structured thinker who can quickly bring structure to any data science problem
- Ability to visualize data stories, adeptness with data visualization tools, and the ability to present insights as cohesive stories to senior leadership
- Excellent capability to organize large data sets collected from many sources (web APIs and internal databases) to extract actionable insights
- Initiate data science programs in the team and collaborate with other data science teams to build a knowledge database
With a reputed service-based company

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
4 - 6 yrs
₹12L - ₹15L / yr
SQL
MySQL
MySQL DBA
MariaDB
MS SQL Server
Role Description
As a Database Administrator, you will be responsible for designing, testing, planning, implementing, protecting, operating, managing and maintaining our company’s databases. The goal is to provide a seamless flow of information throughout the company, considering both backend data structure and frontend accessibility for end-users. You get to work with some of the best minds in the industry at a place where opportunity lurks everywhere and in everything.
Responsibilities
Your responsibilities are as follows.
• Build database systems of high availability and quality depending on each end user’s specialised role
• Design and implement databases in accordance with end users’ information needs and views
• Define users and enable data distribution to the right user, in the appropriate format and in a timely manner
• Use high-speed transaction recovery techniques and back up data
• Minimise database downtime and manage parameters to provide fast query responses
• Provide proactive and reactive data management support and training to users
• Determine, enforce and document database policies, procedures and standards
• Perform tests and evaluations regularly to ensure data security, privacy and integrity
• Monitor database performance, implement changes and apply new patches and versions when required
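As a toy illustration of the backup and fast-recovery duties above, here is a hedged sketch using Python's built-in sqlite3 online-backup API (Python 3.7+). Production MySQL/MariaDB backups would use tools like mysqldump or replication instead; this only shows the idea of copying a live database without downtime:

```python
import sqlite3

# Minimal sketch: copy a live database with SQLite's online-backup API.
# Table and data are invented for illustration.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
src.execute("INSERT INTO users (name) VALUES ('asha'), ('ravi')")
src.commit()

dst = sqlite3.connect(":memory:")  # in practice, a file on backup storage
src.backup(dst)                    # copies pages while src remains usable

count = dst.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)
```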
Required Qualifications
We are looking for individuals who are curious, excited about learning, and navigating
through the uncertainties and complexities that are associated with a growing
company. Some qualifications that we think would help you thrive in this role are:
• Minimum 4 years of experience as a Database Administrator
• Hands-on experience with database standards and end-user applications
• Excellent knowledge of data backup, recovery, security, integrity and SQL
• Familiarity with database design, documentation and coding
• Previous experience with DBA case tools (frontend/backend) and third-party tools
• Familiarity with programming language APIs
• Problem-solving skills and the ability to think algorithmically
• Bachelor's/Master's in CS/IT Engineering, BCA/MCA, or B.Sc/M.Sc in CS/IT

Preferred Qualifications
• Sense of ownership and pride in your performance and its impact on company’s
success
• Critical thinking and problem-solving skills
• Team player
• Good time-management skills
• Great interpersonal and communication skills.
GradRight

Posted by Vivek Jadli
Gurugram
4 - 10 yrs
Best in industry
Tableau
Data Analytics
Data Visualization
Data Management
Business Intelligence (BI)

Brief:

As a BI Developer at GradRight, you’ll be working with Tableau and supporting data sources to build reports for the requirements of various business teams.

 

Responsibilities:

  1. Translate business needs to technical specifications for reports and dashboards
  2. Design, build and deploy BI solutions
  3. Maintain and support data analytics platforms (e.g. Tableau, Mixpanel, Google Analytics, etc.)
  4. Evaluate and improve existing BI systems
  5. Collaborate with teams to integrate systems
  6. Develop and execute database queries, conduct analysis and prepare data to be shared with respective stakeholders
  7. Create visualizations and reports for requested projects
  8. Develop and update technical documentation around reports


Requirements:

  1. At least 3 years of proven experience as a BI Developer
  2. Experience at a startup
  3. Background in data warehouse design (e.g. dimensional modeling) and data mining
  4. In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (extract, transform, load) frameworks
  5. Working knowledge of Tableau
  6. Knowledge of SQL queries and MongoDB
  7. Proven abilities to take initiative and be innovative
  8. Analytical mind with a problem-solving aptitude
Curl

Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹25L / yr
Data Visualization
Power BI
ETL
Business Intelligence (BI)
Data Analytics
Main Responsibilities:

• Work closely with different Front Office and Support Function stakeholders, including but not restricted to Business Management, Accounts, Regulatory Reporting, Operations, Risk, Compliance and HR, on all data collection and reporting use cases
• Collaborate with Business and Technology teams to understand enterprise data, and create an innovative narrative to explain, engage and enlighten regular staff members as well as executive leadership with data-driven storytelling
• Solve data consumption and visualization through a data-as-a-service distribution model
• Articulate findings clearly and concisely for different target use cases, including through presentations, design solutions and visualizations
• Perform ad hoc / automated report generation tasks using Power BI, Oracle BI and Informatica
• Perform data access/transfer and ETL automation tasks using Python, SQL, OLAP/OLTP, RESTful APIs, and IT tools (CFT, MQ-Series, Control-M, etc.)
• Provide support and maintain the availability of BI applications irrespective of the hosting location
• Resolve issues escalated from Business and Functional areas on data quality, accuracy and availability, and provide incident-related communications promptly
• Work to strict deadlines on high-priority regulatory reports
• Serve as a liaison between business and technology to ensure that data-related business requirements for protecting sensitive data are clearly defined, communicated, well understood, and considered as part of operational prioritization and planning
• Work for the APAC Chief Data Office and coordinate with a fully decentralized team across different locations in APAC and the global HQ (Paris)

General Skills:

• Excellent knowledge of RDBMS and hands-on experience with complex SQL is a must; some experience with NoSQL and big data technologies like Hive and Spark would be a plus
• Experience with industrialized reporting on BI tools like Power BI and Informatica
• Knowledge of data-related industry best practices in the highly regulated CIB industry; experience with regulatory report generation for financial institutions
• Knowledge of industry-leading data access, data security, Master Data and Reference Data Management, and establishing data lineage
• 5+ years of experience in Data Visualization / Business Intelligence / ETL developer roles
• Ability to multi-task and manage various projects simultaneously
• Attention to detail
• Ability to present to Senior Management and ExCo; excellent written and verbal communication skills
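To ground the "complex SQL" requirement, here is a minimal window-function example, run through Python's bundled sqlite3 driver. The schema and values are invented for illustration, and SQLite 3.25+ is assumed for window-function support:

```python
import sqlite3

# Rank rows within a partition: a common "complex SQL" building block.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, trade_id INTEGER, notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("fx", 1, 5.0), ("fx", 2, 9.0), ("rates", 3, 7.0),
])
rows = conn.execute("""
    SELECT desk, trade_id,
           RANK() OVER (PARTITION BY desk ORDER BY notional DESC) AS rnk
    FROM trades
    ORDER BY desk, rnk
""").fetchall()
print(rows)
```

The `RANK() OVER (PARTITION BY ...)` clause ranks trades per desk without collapsing rows the way `GROUP BY` would.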
Sizzle
Posted by Vijay Koduri
Bengaluru (Bangalore)
0 - 3 yrs
₹1.5L - ₹1.8L / yr
Data Analytics
Data Annotation
Natural Language Processing (NLP)
Computer Vision

Sizzle is an exciting new startup that’s changing the world of gaming. At Sizzle, we’re building AI to automate gaming highlights, directly from Twitch and YouTube streams.


For this role, we're looking for someone who ideally loves to watch video gaming content on Twitch and YouTube. Specifically, you will help generate training data for all the AI we are building. This will include gathering screenshots, clips and other data from gaming videos on Twitch and YouTube. You will then be responsible for labeling and annotating them. You will work very closely with our AI engineers.


You will:

  • Gather training data as specified by the management and engineering team
  • Label and annotate all the training data
  • Ensure all data is prepped and ready to feed into the AI models
  • Revise the training data as specified by the engineering team
  • Test the output of the AI models and update training data needs

You should have the following qualities:

  • Willingness to work hard and hit deadlines
  • Work well with people
  • Be able to work remotely (if not in Bangalore)
  • Interested in learning about AI and computer vision
  • Willingness to learn rapidly on the job
  • Ideally a gamer or someone interested in watching gaming content online

Skills:

Data labeling, annotation, AI, computer vision, gaming


Work Experience:  0 years to 3 years


About Sizzle

Sizzle is building AI to automate gaming highlights, directly from Twitch and YouTube videos. Presently, there are over 700 million fans around the world that watch gaming videos on Twitch and YouTube. Sizzle is creating a new highlights experience for these fans, so they can catch up on their favorite streamers and esports leagues. Sizzle is available at www.sizzle.gg.
Ganit Business Solutions
Posted by Viswanath Subramanian
Chennai, Bengaluru (Bangalore), Mumbai
4 - 6 yrs
₹7L - ₹15L / yr
SQL
Amazon Web Services (AWS)
Data Warehouse (DWH)
Informatica
ETL

Responsibilities:

  • Must be able to write quality code and build secure, highly available systems.
  • Assemble large, complex datasets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc., with guidance.
  • Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
  • Monitor performance and advise on any necessary infrastructure changes.
  • Define data retention policies.
  • Implement the ETL process and optimal data pipeline architecture.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Create design documents that describe the functionality, capacity, architecture, and process.
  • Develop, test, and implement data solutions based on finalized design documents.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Proactively identify potential production issues and recommend and implement solutions.

Skillsets:

  • Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Proficient understanding of distributed computing principles.
  • Experience working with batch-processing / real-time systems using various open-source technologies like NoSQL, Spark, Pig, Hive, and Apache Airflow.
  • Implemented complex projects dealing with considerable data sizes (PB scale).
  • Optimization techniques (performance, scalability, monitoring, etc.).
  • Experience with integration of data from multiple data sources.
  • Experience with NoSQL databases, such as HBase, Cassandra, and MongoDB.
  • Knowledge of various ETL techniques and frameworks, such as Flume.
  • Experience with various messaging systems, such as Kafka or RabbitMQ.
  • Good understanding of Lambda Architecture, along with its advantages and drawbacks.
  • Creation of DAGs for data engineering.
  • Expert at Python/Scala programming, especially for data engineering / ETL purposes.
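The extract-transform-load shape that runs through the skillset above can be sketched in a few lines of plain Python. The field names are invented, and a real pipeline would use Spark, Airflow DAGs, and a warehouse write instead of these stand-ins:

```python
import csv
import io
import json

# Toy ETL: raw CSV in, aggregated JSON out.
RAW = "user_id,amount\n1,10.5\n2,3.0\n1,4.5\n"

def extract(text):
    # Read rows from the raw source (here, an in-memory CSV)
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Aggregate amounts per user
    totals = {}
    for r in rows:
        totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + float(r["amount"])
    return totals

def load(totals):
    # Stand-in for a warehouse write: serialize deterministically
    return json.dumps(totals, sort_keys=True)

result = load(transform(extract(RAW)))
print(result)  # {"1": 15.0, "2": 3.0}
```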
Avhan Technologies Pvt Ltd
Posted by Aarti Vohra
Kolkata
7 - 10 yrs
₹8L - ₹20L / yr
MDX
DAX
SQL
SQL Server
Microsoft Analysis Services
Experience: 7 to 8 years
Notice Period: Immediate to 15 days
Job Location : Kolkata
 
Responsibilities:
• Develop and improve solutions spanning data processing activities from the data lake (stage) to star schemas and reporting views/tables, and finally into SSAS.
• Develop and improve Microsoft Analysis Services cubes (tabular and dimensional)
• Collaborate with other teams within the organization and be able to devise the technical solution as it relates to the business & technical requirements
• Mentor team members and be proactive in training and coaching team members to develop their proficiency in Analysis Services
• Maintain documentation for all processes implemented
• Adhere to and suggest improvements to coding standards, applying best practices
 
Skillsets:
• Proficient in MDX and DAX for query in SSAS
Product Development

Agency job
via Purple Hirez by Aditya K
Hyderabad
12 - 20 yrs
₹15L - ₹50L / yr
Analytics
skill iconData Analytics
skill iconKubernetes
PySpark
skill iconPython
+1 more

Job Description

We are looking for an experienced engineer with superb technical skills, who will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions.

Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, Machine Learning is a plus.

 

Skills

  • Bachelor's/Master's/PhD in CS, or equivalent industry experience
  • Demonstrated expertise in building and shipping cloud-native applications
  • 5+ years of industry experience administering (including setting up, managing and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents like Hive
  • Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
  • Experience with cloud platform services such as AWS, Azure or GCP, especially with EKS and Managed Kafka
  • 5+ years of industry experience in Python
  • Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
  • Experience with scripting languages (Python experience highly desirable) and API development using Swagger
  • Experience implementing automated testing platforms and unit tests
  • Proficient understanding of code versioning tools, such as Git
  • Familiarity with continuous integration tools such as Jenkins

Responsibilities

  • Architect, design and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd and Druid
  • Create custom Operators for Kubernetes and Kubeflow
  • Develop data ingestion processes and ETLs
  • Assist in DevOps operations
  • Design and implement APIs
  • Identify performance bottlenecks and bugs, and devise solutions to these problems
  • Help maintain code quality, organization, and documentation
  • Communicate with stakeholders regarding various aspects of the solution
  • Mentor team members on best practices
Anaxee Technologies
Indore
0 - 1 yrs
₹1L - ₹2L / yr
Data Science
Internship
Data Analytics
Data Structures

Job role:

As a data analyst, you will be responsible for compiling actionable insights from data and helping program, sales and marketing managers build data-driven processes. Your role will involve driving initiatives to optimize for operational excellence and revenue.


Job Location: Indore | Full-Time Internship | Stipend: Performance-Based


About the company:

Anaxee Digital Runners is building India's largest last-mile verification & data collection network of Digital Runners (shared feet-on-street, tech-enabled) to help Businesses & Consumers reach remotest parts of India, on-demand.  KYC | Field Verification | Data Collection | eSign | Tier-2, 3 & 4

Sounds like a moonshot? It is. We want to make REACH across India (remotest places), as easy as ordering pizza, on-demand. Already serving 11000 pin codes (57% of India) | Website: www.anaxee.com

Important: Check out our company pitch (6 min video) to understand this goal - https://www.youtube.com/watch?v=7QnyJsKedz8


Responsibilities:

  • Ensure that data flows smoothly from source to destination so that it can be processed
  • Utilize strong database skills to work with large, complex data sets to extract insights
  • Filter and cleanse unstructured (or ambiguous) data into usable data sets that can be analyzed to extract insights and improve business processes
  • Identify new internal and external data sources to support analytics initiatives and work with appropriate partners to absorb the data into new or existing data infrastructure
  • Build tools for automating repetitive tasks so that bandwidth can be freed for analytics
  • Collaborate with program managers and business analysts to help them come up with actionable, high-impact insights across product lines and functions
  • Work closely with top management to prioritize information and analytic needs


Requirements:

  • Bachelor's or Master's (pursuing or graduated) in a quantitative field (such as Engineering, Statistics, Math, Economics, or Computer Science with Modeling/Data Science), preferably with work experience of over [X] years.
  • Ability to program in any high-level language is required. Familiarity with R and statistical packages is preferred.
  • Proven problem solving and debugging skills.
  • Familiar with database technologies and tools (SQL/R/SAS/JMP etc.), data warehousing, transformation, and processing. Work experience with real data for customer insights, business, and market analysis will be advantageous.
  • Experience with text analytics, data mining and social media analytics.
  • Statistical knowledge in standard techniques: Logistic Regression, Classification models, Cluster Analysis, Neural Networks, Random Forests, Ensembles, etc.
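As a concrete instance of the "Logistic Regression" item above, here is a from-scratch fit on a toy 1-D dataset via plain gradient descent on the log-loss. The data and hyperparameters are invented for illustration; in practice one would reach for scikit-learn or statsmodels:

```python
from math import exp

# Toy separable data: labels flip around x = 2.5
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0, 0, 0, 1, 1, 1]

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    # Accumulate the log-loss gradient over the whole dataset
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + exp(-(w * x + b)))  # sigmoid prediction
        gw += (p - y) * x
        gb += (p - y)
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

def predict(x):
    return 1.0 / (1.0 + exp(-(w * x + b)))

print(round(predict(0.0), 3), round(predict(5.0), 3))
```

After training, points on the left of the boundary get probabilities near 0 and points on the right near 1.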
TintED
Posted by Kumar Aniket
Remote, Kolkata
0 - 4 yrs
₹3L - ₹7L / yr
Data Science
Python
R Programming
We aim to transform the recruiting industry.
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.