MLS Jobs in Bangalore (Bengaluru)

Apply to 11+ MLS Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest MLS Job opportunities across top companies like Google, Amazon & Adobe.

PAGO Analytics India Pvt Ltd
Posted by Vijay Cheripally
Remote, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida)
2 - 8 yrs
₹8L - ₹15L / yr
Python
PySpark
Microsoft Windows Azure
SQL Azure
Data Analytics
+6 more
Be an integral part of large-scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution on large projects
Work across all phases of the SDLC and apply software engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects


Object-oriented languages (e.g. Python, PySpark, Java, C#, C++) and frameworks (e.g. J2EE or .NET)
Database programming using any flavour of SQL
Expertise in relational and dimensional modelling, including big data technologies
Exposure to all phases of the SDLC, including testing and deployment
Expertise in Microsoft Azure is mandatory, including components such as Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc.
Good knowledge of Python and Spark is required (see the illustrative sketch below)
Good understanding of how to enable analytics using cloud technology and MLOps
Experience with Azure infrastructure and Azure DevOps will be a strong plus
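For context, here is a minimal, purely illustrative PySpark sketch of the kind of job this stack typically involves; the storage account, container and column names are hypothetical, and ADLS authentication is assumed to already be configured on the cluster.

```python
# Illustrative only: a minimal PySpark job on Azure of the kind described above.
# Storage account, container and column names are placeholders; ADLS access is
# assumed to be configured (e.g. via a mounted credential or managed identity).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-sales-aggregation").getOrCreate()

# Read raw CSV files landed in Azure Data Lake Storage Gen2 (hypothetical path)
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/2023/"))

# Basic cleansing and a simple daily aggregation
daily = (raw
         .withColumn("order_date", F.to_date("order_timestamp"))
         .groupBy("order_date", "region")
         .agg(F.sum("amount").alias("total_amount"),
              F.countDistinct("order_id").alias("order_count")))

# Persist as a partitioned Parquet table for downstream BI use
(daily.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("abfss://curated@examplelake.dfs.core.windows.net/sales_daily/"))
```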
Leading StartUp Focused On Employee Growth

Agency job
via Qrata by Blessy Fernandes
Bengaluru (Bangalore)
4 - 8 yrs
₹25L - ₹45L / yr
Data Analytics
Data Analyst
Tableau
Mixpanel
CleverTap
+2 more
4+ years of experience in data and analytics.
● Knowledge of Excel, SQL and writing code in Python.
● Experience with Reporting and Business Intelligence tools like Tableau and Metabase.
● Exposure to distributed analytics processing technologies (e.g. Hive, Spark) is desired.
● Experience with CleverTap, Mixpanel, Amplitude, etc.
● Excellent communication skills.
● Background in market research and project management.
● Attention to detail.
● Problem-solving aptitude.
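As a rough illustration of the day-to-day tooling listed above, the sketch below pulls event data from a SQL database into pandas and computes a simple funnel metric; the connection string, table and column names are invented for the example.

```python
# Illustrative sketch only: table, column names and connection string are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# In practice the connection string would come from configuration or environment variables.
engine = create_engine("postgresql://analyst:secret@localhost:5432/product_db")

events = pd.read_sql(
    """
    SELECT user_id, event_name, event_date
    FROM app_events
    WHERE event_date >= CURRENT_DATE - INTERVAL '30 days'
    """,
    engine,
)

# Simple two-step funnel: how many users who signed up also made a purchase?
signed_up = set(events.loc[events["event_name"] == "signup", "user_id"])
purchased = set(events.loc[events["event_name"] == "purchase", "user_id"])
conversion = len(signed_up & purchased) / len(signed_up) if signed_up else 0.0
print(f"30-day signup -> purchase conversion: {conversion:.1%}")
```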
OSBIndia Private Limited
Bengaluru (Bangalore), Hyderabad
5 - 12 yrs
₹10L - ₹18L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL
Stored Procedures
+3 more

1. Core Responsibilities

· Lead solutions for data engineering
· Maintain the integrity of both the design and the data that is held within the architecture
· Champion and educate people in the development and use of data engineering best practices
· Support the Head of Data Engineering and lead by example
· Contribute to the development of database management services and associated processes relating to the delivery of data solutions
· Provide requirements analysis, documentation, development, delivery and maintenance of data platforms
· Develop database requirements in a structured and logical manner, ensuring delivery is aligned with business prioritisation and best practice
· Design and deliver performance enhancements, application migration processes and version upgrades across a pipeline of BI environments
· Provide support for the scoping and delivery of BI capability to internal users
· Identify risks and issues and escalate them to the Line / Project Manager
· Work with clients, existing asset owners and their service providers, and non-BI development staff to clarify and deliver work-stream objectives within timescales that meet overall project expectations
· Develop and maintain documentation in support of all BI processes
· Proactively identify cost-justifiable improvements to data manipulation processes
· Research and promote relevant BI tools and processes that contribute to increased efficiency and capability in support of corporate objectives
· Promote a culture that embraces change, continuous improvement and a ‘can do’ attitude
· Demonstrate enthusiasm and self-motivation at all times
· Establish effective working relationships with other internal teams to drive improved efficiency and effective processes
· Be a champion for high-quality data and for the use of strategic data repositories, the associated relational model, and the Data Warehouse to optimise the delivery of accurate, consistent and reliable business intelligence
· Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility, and demonstrate in your day-to-day work that you put customers at the heart of everything you do
· Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility, and demonstrate in your day-to-day work that you treat data as an important corporate asset which must be protected and managed
· Maintain the company’s compliance standards and ensure timely completion of all mandatory online training modules and attestations

 

2. Experience Requirements

· 5 years’ Data Engineering / ETL development experience is essential
· 5 years’ data design experience in an MI / BI / Analytics environment (Kimball, lakehouse, data lake) is essential
· 5 years’ experience of working in a structured change-management project lifecycle is essential
· Experience of working in a financial services environment is desirable
· Experience of dealing with senior management within a large organisation is desirable
· 5 years’ experience of developing in conjunction with large, complex projects and programmes is desirable
· Experience mentoring other members of the team on best practice and internal standards is essential
· Experience with cloud data platforms (e.g. Microsoft Azure) is desirable

 

3. Knowledge Requirements

· A strong knowledge of business intelligence solutions and an ability to translate this into data solutions for the broader business is essential
· Strong, demonstrable knowledge of data warehouse methodologies
· A robust understanding of high-level business processes is essential
· An understanding of data migration, including reconciliation, data cleansing and cutover, is desirable
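To make the warehousing and ETL responsibilities above a little more concrete, here is a hedged, minimal Python sketch of a typical incremental fact-table load; the connection details, schema and column names are all hypothetical.

```python
# Illustrative sketch only: a simple incremental load into a warehouse fact table.
# Connection string, schema and column names are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=warehouse.example.internal;DATABASE=dwh;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Load only rows newer than the current high-water mark (classic incremental ETL)
cur.execute("SELECT COALESCE(MAX(loaded_at), '1900-01-01') FROM dwh.fact_payments")
high_water_mark = cur.fetchone()[0]

cur.execute(
    """
    INSERT INTO dwh.fact_payments (payment_id, customer_key, amount, paid_at, loaded_at)
    SELECT s.payment_id, d.customer_key, s.amount, s.paid_at, SYSDATETIME()
    FROM staging.payments AS s
    JOIN dwh.dim_customer AS d ON d.customer_id = s.customer_id
    WHERE s.paid_at > ?
    """,
    high_water_mark,
)
conn.commit()
print(f"Loaded {cur.rowcount} new payment rows")
```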

Cubera Tech India Pvt Ltd
Bengaluru (Bangalore), Chennai
5 - 8 yrs
Best in industry
Data engineering
Big Data
Java
Python
Hibernate (Java)
+10 more

Data Engineer - Senior

Cubera is a data company revolutionizing big data analytics and AdTech through data-share-value principles, wherein users entrust their data to us. We refine the art of understanding, processing, extracting, and evaluating the data that is entrusted to us. We are a gateway for brands to increase their lead efficiency as the world moves towards Web3.

What are you going to do?

Design and develop high-performance, scalable solutions that meet the needs of our customers.
Work closely with Product Management, Architects and cross-functional teams.
Build and deploy large-scale systems in Java/Python.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Create data tools for analytics and data scientist team members that assist them in building and optimizing their algorithms.
Follow best practices that can be adopted in the big data stack.
Use your engineering experience and technical skills to drive features and mentor engineers.

What are we looking for (Competencies)?

Bachelor’s degree in computer science, computer engineering, or a related technical discipline.
Overall 5 to 8 years of programming experience in Java and Python, including object-oriented design.
Data handling frameworks: should have a working knowledge of one or more data handling frameworks such as Hive, Spark, Storm, Flink, Beam, Airflow, NiFi, etc. (a minimal orchestration sketch follows below).
Data infrastructure: should have experience in building, deploying and maintaining applications on popular cloud infrastructure such as AWS, GCP, etc.
Data store: must have expertise in at least one general-purpose NoSQL data store such as Elasticsearch, MongoDB, Redis, Redshift, etc.
Strong sense of ownership, focus on quality, responsiveness, efficiency, and innovation.
Ability to work with distributed teams in a collaborative and productive manner.
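As one hedged example of the data-handling frameworks mentioned above, a minimal Airflow DAG orchestrating a daily ingest-then-transform flow might look like this; the task bodies, IDs and schedule are placeholders rather than anything specific to this role.

```python
# Illustrative Airflow DAG sketch; task bodies and names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_events(**context):
    # In a real pipeline this would pull from Kafka/S3/an API into a staging area.
    print("ingesting raw events for", context["ds"])


def build_aggregates(**context):
    # In a real pipeline this would run a Spark/Hive job over the staged data.
    print("building daily aggregates for", context["ds"])


with DAG(
    dag_id="daily_event_pipeline",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_events", python_callable=ingest_events)
    aggregate = PythonOperator(task_id="build_aggregates", python_callable=build_aggregates)

    ingest >> aggregate
```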

Benefits:

Competitive salary packages and benefits.
A collaborative, lively and upbeat work environment with young professionals.

Job Category: Development
Job Type: Full Time
Job Location: Bangalore

 

Impetus Technologies
Posted by Gangadhar T.M
Bengaluru (Bangalore)
4 - 8 yrs
₹20L - ₹35L / yr
Data Science
Pricing Strategy
Python
Predictive analytics
Pricing models
+1 more
Looking for a Data Scientist with strong expertise in classical machine learning algorithms, SQL and Python.
Experience with pricing models will be a definite plus (see the illustrative sketch below).
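For flavour, a minimal classical-ML pricing model along the lines described above could be sketched like this; the CSV path and the feature/target columns are purely illustrative.

```python
# Illustrative sketch: a simple classical-ML pricing model.
# The CSV path and feature/target columns are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("historical_prices.csv")
features = ["cost", "competitor_price", "demand_index", "season"]
X = pd.get_dummies(df[features], columns=["season"])   # one-hot encode the categorical feature
y = df["optimal_price"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("MAE:", mean_absolute_error(y_test, preds))
```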
Deep-Rooted.co (formerly Clover)
Posted by Likhithaa D
Bengaluru (Bangalore)
3 - 6 yrs
₹12L - ₹15L / yr
Java
Python
SQL
AWS Lambda
HTTP
+5 more

Deep-Rooted.Co is on a mission to get fresh, clean, community (local-farmer) produce from harvest to your home with a promise of quality first! Our values are rooted in trust, convenience, and dependability, with a bunch of learning & fun thrown in.


Founded out of Bangalore by Arvind, Avinash, Guru and Santosh, we have raised $7.5 million to date in Seed, Series A and debt funding from investors including Accel, Omnivore and Mayfield. Our brand Deep-Rooted.Co, launched in August 2020, was the first of its kind in India’s fruits & vegetables (F&V) space; it is present in Bangalore and Hyderabad and on a journey of expansion to newer cities, managed seamlessly through a tech platform designed and built to transform the agri-tech sector.


Deep-Rooted.Co is committed to building a diverse and inclusive workplace and is an equal-opportunity employer.  

How is this possible? It’s because we work with smart people. We are looking for Engineers in Bangalore to work with the Product Leader (Founder) (https://www.linkedin.com/in/gururajsrao/) and the CTO (https://www.linkedin.com/in/sriki77/). This is a meaningful project for us, and we are sure you will love it as it touches everyday life and is fun. This will be a virtual consultation.


We want to start the conversation about the project we have for you, but before that, we want to connect with you to know what’s on your mind. Do drop a note sharing your mobile number and letting us know when we can catch up.

Purpose of the role:

* As a startup, we have data distributed across various sources like Excel, Google Sheets, databases, etc. As we grow, we need swift decision-making based on a lot of data that already exists. You will help us bring together all this data and put it in a data model that can be used in business decision-making.
* Handle the nuances of the Excel and Google Sheets APIs.
* Pull data in and manage its growth, freshness and correctness.
* Transform data into a format that aids easy decision-making for Product, Marketing and Business Heads.
* Understand the business problem, solve it using technology, and take it to production - no hand-offs; the full path to production is yours.

Technical expertise:
* Good knowledge of and experience with programming languages - Java, SQL, Python.
* Good knowledge of data warehousing and data architecture.
* Experience with data transformations and ETL.
* Experience with API tools and more closed systems like Excel, Google Sheets, etc. (a minimal sketch follows below).
* Experience with the AWS cloud platform and Lambda.
* Experience with distributed data processing tools.
* Experience with container-based deployments on the cloud.

Skills:
Java, SQL, Python, Data Build Tool (dbt), AWS Lambda, HTTP, REST APIs, Extract-Transform-Load (ETL).
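To give a feel for the kind of work described above, here is a hedged sketch that pulls a Google Sheet into pandas and lands it in a SQL table for reporting; the credentials file, spreadsheet key, connection string and table name are all placeholders.

```python
# Illustrative sketch: consolidate a Google Sheet into a reporting database.
# The credentials file, spreadsheet key, connection string and table name are placeholders.
import gspread
import pandas as pd
from sqlalchemy import create_engine

gc = gspread.service_account(filename="service_account.json")
worksheet = gc.open_by_key("1AbCdEfGhIjKlMnOpQrStUvWxYz").sheet1

# get_all_records() returns a list of dicts keyed by the header row
df = pd.DataFrame(worksheet.get_all_records())
df["loaded_at"] = pd.Timestamp.utcnow()

engine = create_engine("postgresql://reporting:secret@localhost:5432/analytics")
df.to_sql("farm_harvest_snapshot", engine, if_exists="replace", index=False)
print(f"Loaded {len(df)} rows into farm_harvest_snapshot")
```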
Remote, Bengaluru (Bangalore), Hyderabad
0 - 1 yrs
₹2.5L - ₹4L / yr
SQL
Data engineering
Big Data
Python
● Hands-on work experience as a Python Developer
● Hands-on work experience in SQL/PLSQL
● Expertise in at least one popular Python framework (like Django, Flask or Pyramid)
● Knowledge of object-relational mapping (ORM)
● Familiarity with front-end technologies (like JavaScript and HTML5)
● Willingness to learn and upgrade to big data and cloud technologies like PySpark, Azure, etc.
● Team spirit
● Good problem-solving skills
● Ability to write effective, scalable code
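As a small, hedged illustration of the framework and ORM skills listed above, a minimal Flask app with a SQLAlchemy model might look like this; the database URL, model and route are made up for the example.

```python
# Illustrative sketch: minimal Flask app with a SQLAlchemy ORM model.
# The database URL, model fields and route are placeholders.
from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///demo.db"
db = SQLAlchemy(app)


class Customer(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), nullable=False)


@app.route("/customers")
def list_customers():
    # The ORM maps rows to Customer objects; serialize them as JSON
    return jsonify([{"id": c.id, "name": c.name} for c in Customer.query.all()])


if __name__ == "__main__":
    with app.app_context():
        db.create_all()
    app.run(debug=True)
```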
A Chemical & Purifier Company headquartered in the US.
Agency job
via Multi Recruit by Fiona RKS
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹18L / yr
Azure Data Factory
Azure Data Engineer
SQL
SQL Azure
+2 more
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.


Basic Qualifications
  • 2+ years of experience in a Data Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases
  • Experience with data pipeline and workflow management tools
  • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Understanding of ELT and ETL patterns and when to use each. Understanding of data models and transforming data into the models
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
  • Strong analytic skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
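As a hedged sketch of the stream-processing experience mentioned above, a minimal Spark Structured Streaming job reading from Kafka and landing to storage could look like this; the broker address, topic, schema and paths are placeholders, and the Kafka connector package is assumed to be available on the cluster.

```python
# Illustrative sketch: Spark Structured Streaming from Kafka to Parquet.
# Broker, topic, schema and paths are placeholders; the spark-sql-kafka
# connector is assumed to be available on the cluster.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("country", StringType()),
])

# Parse the Kafka message value (JSON) into typed columns
orders = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker.example.internal:9092")
          .option("subscribe", "orders")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("o"))
          .select("o.*"))

query = (orders.writeStream
         .format("parquet")
         .option("path", "abfss://curated@examplelake.dfs.core.windows.net/orders/")
         .option("checkpointLocation", "abfss://curated@examplelake.dfs.core.windows.net/_chk/orders/")
         .outputMode("append")
         .start())

query.awaitTermination()
```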
Yottaasys AI LLC
Posted by Dinesh Krishnan
Bengaluru (Bangalore), Singapore
2 - 5 yrs
₹9L - ₹20L / yr
Data Science
Deep Learning
R Programming
Python
Machine Learning (ML)
+2 more
We are a US-headquartered product company looking to hire a few passionate Deep Learning and Computer Vision team players with 2-5 years of experience! If you are any of these:
1. An expert in deep learning and machine learning techniques,
2. Extremely good at image/video processing,
3. Someone with a good understanding of linear algebra, optimization techniques, statistics and pattern recognition,
then you are the right fit for this position.
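For a flavour of the image/video-processing work above, here is a hedged sketch that classifies a single frame with OpenCV and a pretrained torchvision model; the image path is a placeholder and the pretrained weights are assumed to be downloadable.

```python
# Illustrative sketch: classify one video frame with a pretrained CNN.
# The image path is a placeholder; torchvision weights are downloaded on first use.
import cv2
import torch
from torchvision import models, transforms

frame = cv2.imread("frame_0001.jpg")                      # BGR uint8 array (placeholder file)
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

preprocess = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
batch = preprocess(rgb).unsqueeze(0)                      # shape: (1, 3, 224, 224)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()
with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)
print("top-1 class index:", int(probs.argmax()), "confidence:", float(probs.max()))
```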
A Fintech startup in Dubai

Agency job
via Jobbie by Sourav Nandi
Remote, Dubai, Bengaluru (Bangalore), Mumbai
2 - 18 yrs
₹14L - ₹38L / yr
Data Science
Python
R Programming
RESPONSIBILITIES AND QUALIFICATIONS

The mission of the Marcus Surveillance Analytics team is to deliver a platform which detects security incidents that have a tangible business impact and an actionable response. You will work alongside industry-leading technologists who have recently joined Goldman from across consumer security, technology, fintech, finance and quant firms. The role has a broad scope and will involve interacting with senior leaders of Goldman and the Consumer business on a regular basis. The position is hands-on and requires a driven, “take ownership”-oriented individual who is intently focused on execution. You will work directly with developers, business leaders, vendors and partners in order to deliver security assets to the consumer business.

Responsibilities:
* Develop a team, vision and platform which identifies and prioritizes actionable security & fraud risks with tangible business impact across Goldman's consumer and commercial banking businesses.
* Develop response and recovery technology and programs to ensure resilience from fraud and abuse events.
* Manage, develop and operationalize analytics which discover security & fraud events and identify risks for all of Goldman's consumer businesses.
* Partner with fraud/abuse operations and leadership to ensure consumer fraud rates are within industry norms, and own outcomes related to fraud improvements.

Skills and experience we are looking for:
* BA/BS degree in Computer Science, Cybersecurity, or other relevant Computer/Data/Engineering degrees
* 2+ years of experience as a security professional or data analyst/scientist/engineer
* Python, PySpark, R, Bash, SQL, Splunk (search, ES, UBA)
* Experience with cloud infrastructure and big data tool sets
* Visualization tools such as Tableau or D3
* Research and development to create innovative predictive detections for security and fraud
* Build a 24/7 real-time monitoring system with a long-term vision for scaling to new lines of consumer businesses
* Strong focus on customer experience and product usability
* Ability to work closely with the business, fraud, and security incident response teams on creating actionable detections
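As a hedged illustration of the kind of fraud/anomaly detection analytics described above (not the team's actual method), a minimal unsupervised sketch with scikit-learn might look like this; the CSV path and feature set are invented for the example.

```python
# Illustrative sketch only: unsupervised anomaly scoring of transactions.
# The CSV path and features are invented; this is not the team's actual approach.
import pandas as pd
from sklearn.ensemble import IsolationForest

tx = pd.read_csv("transactions.csv")
features = tx[["amount", "merchant_risk_score", "txns_last_24h", "account_age_days"]]

# IsolationForest flags points that are easy to isolate as likely anomalies
model = IsolationForest(contamination=0.01, random_state=0)
tx["anomaly_flag"] = model.fit_predict(features)          # -1 = anomalous, 1 = normal

suspicious = tx[tx["anomaly_flag"] == -1]
print(f"{len(suspicious)} of {len(tx)} transactions flagged for review")
```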
Carmozo
Posted by S Mallikarjun
Bengaluru (Bangalore)
2 - 7 yrs
₹3L - ₹5L / yr
Image processing
Machine Learning (ML)
Python
Carmozo.com is a marketplace service company that addresses car-care needs. We connect car owners to mechanics/workshops, and we are operational in Bangalore and Hyderabad.

Requirements:
1) Experience with image processing frameworks and pattern-matching algorithms (a small sketch follows below)
2) Experience with machine learning toolkits
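As a small, hedged example of the pattern-matching requirement, OpenCV template matching might be sketched like this; the image paths and threshold are placeholders.

```python
# Illustrative sketch: locate a part/logo in a car photo via template matching.
# Image paths and the match threshold are placeholders.
import cv2

scene = cv2.imread("car_photo.jpg", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("headlight_template.jpg", cv2.IMREAD_GRAYSCALE)

result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

if max_val > 0.8:   # threshold chosen arbitrarily for the sketch
    h, w = template.shape
    print(f"Match at {max_loc} with score {max_val:.2f}, box size {w}x{h}")
else:
    print("No confident match found")
```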