Scala Jobs in Mumbai

Explore top Scala job opportunities in Mumbai from top companies and startups. All jobs are added by verified employees who can be contacted directly below.
Mumbai
2 - 3 yrs
₹8L - ₹12L / yr
Java
C++
Scala
Spark

LogiNext is looking for a technically savvy and passionate Software Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.

In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.

Your goal will be to help our company analyze trends to make better decisions. Data scientists who do not understand how the software works will struggle in this role. Beyond experience developing in R and Python, you must know modern approaches to software development and their impact: DevOps, continuous integration and deployment, and cloud computing are everyday skills for managing and processing data.

Responsibilities:

  • Identify valuable data sources and automate collection processes
  • Undertake preprocessing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams
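One responsibility above, combining models through ensemble modeling, can be illustrated with a stdlib-only Scala sketch. Everything here (the names, the majority-vote scheme, the toy threshold "models") is an illustrative assumption, not the employer's actual stack:

```scala
// Minimal majority-vote ensemble sketch (illustrative only).
// Each "model" is just a function from a feature vector to a 0/1 label.
object VotingEnsemble {
  type Model = Vector[Double] => Int

  // Combine base models by majority vote; ties fall back to label 1.
  def majorityVote(models: Seq[Model])(features: Vector[Double]): Int = {
    val votes = models.map(m => m(features))
    val ones  = votes.count(_ == 1)
    if (ones * 2 >= votes.size) 1 else 0
  }

  def main(args: Array[String]): Unit = {
    // Three toy "models": threshold rules on different features.
    val m1: Model = f => if (f(0) > 0.5) 1 else 0
    val m2: Model = f => if (f(1) > 0.5) 1 else 0
    val m3: Model = f => if (f.sum > 1.0) 1 else 0
    val models = Seq(m1, m2, m3)
    println(majorityVote(models)(Vector(0.9, 0.8))) // all three vote 1
    println(majorityVote(models)(Vector(0.1, 0.2))) // all three vote 0
  }
}
```

In practice the base learners would come from an ML library and the vote might be weighted, but the combination step is this simple fold over predictions.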


Requirements:

  • Bachelor's degree or higher in Computer Science, Information Technology, Information Systems, Statistics, Mathematics, Commerce, Engineering, Business Management, Marketing or a related field from a top-tier school
  • 2 to 3 years' experience in data mining, data modeling, and reporting
  • Understanding of SaaS-based products and services
  • Understanding of machine learning and operations research
  • Experience with R, SQL and Python; familiarity with Scala, Java or C++ is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
  • Analytical mind, business acumen and problem-solving aptitude
  • Excellent communication and presentation skills
  • Proficiency in Excel for data management and manipulation
  • Experience in statistical modeling techniques and data wrangling
  • Able to work independently and set goals keeping business objectives in mind

Mumbai
4 - 7 yrs
₹12L - ₹19L / yr
Machine Learning (ML)
Data Science
PHP
Java
Spark

LogiNext is looking for a technically savvy and passionate Senior Software Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.

In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.

Your goal will be to help our company analyze trends to make better decisions. Data scientists who do not understand how the software works will struggle in this role. Beyond experience developing in R and Python, you must know modern approaches to software development and their impact: DevOps, continuous integration and deployment, and cloud computing are everyday skills for managing and processing data.

Responsibilities:

  • Adapt and enhance machine learning techniques based on physical intuition about the domain
  • Design sampling methodology and prepare data, including data cleaning, univariate analysis and missing value imputation; identify appropriate analytic and statistical methodology; develop predictive models and document the process and results
  • Lead projects both as a principal investigator and project manager, responsible for meeting project requirements on schedule and on budget
  • Coordinate and lead efforts to innovate by deriving insights from heterogeneous sets of data generated by our suite of Aerospace products
  • Support and mentor data scientists
  • Maintain and work with our data pipeline that transfers and processes several terabytes of data using Spark, Scala, Python, Apache Kafka, Pig/Hive and Impala
  • Work directly with application teams/partners (internal clients such as Xbox, Skype, Office) to understand their offerings/domain and help them become successful with data so they can run controlled experiments (A/B testing)
  • Understand the data generated by experiments, and produce actionable, trustworthy conclusions from them
  • Apply data analysis, data mining and data processing to present data clearly and develop experiments (A/B testing)
  • Work with the development team to build tools for data logging and repeatable data tasks to accelerate and automate data scientist duties


Requirements:

  • Bachelor's or Master's degree in Computer Science, Math, Physics, Engineering, Statistics or another technical field; PhD preferred
  • 4 to 7 years of experience in data mining, data modeling, and reporting
  • 3+ years of experience working with large data sets or doing large-scale quantitative analysis
  • Expert SQL scripting required
  • Development experience in one of the following: Scala, Java, Python, Perl, PHP, C++ or C#
  • Experience working with Hadoop, Pig/Hive, Spark, MapReduce
  • Ability to drive projects
  • Basic understanding of statistics: hypothesis testing, p-values, confidence intervals, regression, classification, and optimization are core lingo
  • Analysis: able to perform exploratory data analysis and get actionable insights from the data, with impressive visualization
  • Modeling: familiar with ML concepts and algorithms; understanding of the internals and pros/cons of models is required
  • Strong algorithmic problem-solving skills
  • Experience manipulating large data sets through statistical software (e.g. R, SAS) or other methods
  • Superior verbal, visual and written communication skills to educate and work with cross-functional teams on controlled experiments
  • Experimentation design or A/B testing experience is preferred
  • Experience in team management
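The statistics vocabulary above (hypothesis testing, p-values, controlled A/B experiments) ultimately reduces to calculations like the two-proportion z-test below. This is a self-contained Scala sketch using the Abramowitz-Stegun approximation of the normal CDF; the function and variable names are our own, purely for illustration:

```scala
import scala.math.{abs, exp, sqrt}

// Two-proportion z-test for an A/B experiment (stdlib-only sketch).
object ABTest {
  // Standard normal CDF Phi(x) = (1 + erf(x / sqrt 2)) / 2, with erf
  // computed via the Abramowitz-Stegun 7.1.26 polynomial approximation.
  private def phi(x: Double): Double = {
    val t = 1.0 / (1.0 + 0.3275911 * abs(x) / sqrt(2.0))
    val poly = (((((1.061405429 * t - 1.453152027) * t) + 1.421413741) * t
      - 0.284496736) * t + 0.254829592) * t
    val erf = 1.0 - poly * exp(-x * x / 2.0)
    if (x >= 0) 0.5 * (1.0 + erf) else 0.5 * (1.0 - erf)
  }

  /** Returns (zScore, twoSidedPValue) for conversions/sample sizes of A and B. */
  def zTest(convA: Int, nA: Int, convB: Int, nB: Int): (Double, Double) = {
    val pA = convA.toDouble / nA
    val pB = convB.toDouble / nB
    val pooled = (convA + convB).toDouble / (nA + nB)
    val se = sqrt(pooled * (1 - pooled) * (1.0 / nA + 1.0 / nB))
    val z = (pB - pA) / se
    val p = 2.0 * (1.0 - phi(abs(z)))
    (z, p)
  }
}
```

For example, `ABTest.zTest(120, 1000, 150, 1000)` compares a 12% baseline against a 15% variant; the p-value near 0.05 is what "significant at the 95% level" means in an experimentation report.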


Opportunity with top conglomerate

Agency job
via Seven N Half by Gurpreet Desai
Mumbai, Gurugram
3 - 9 yrs
₹5L - ₹30L / yr
Scala
SQL
Spark
Hadoop
Big Data
Functional / Technical Skills:

- Big data development experience – Kafka, Hadoop
- Experience building data pipelines using Spark and/or Hive.
- Strong knowledge of Python
- Advanced proficiency in Scala, SQL, NoSQL
- Strong in database and data warehousing concepts
- Expertise in SQL, SQL tuning, schema design, Python and ETL processes
- Experience with cloud technologies required (Azure, data modeling, Azure Databricks, Azure Data Factory)
- Experience working with Azure Data Lake and Stream Analytics
- Highly Motivated, Self-starter and quick learner
- Proficiency in Statistical procedures, Experiments and Machine Learning
techniques
- Must know the basics of data analytics and data modeling
- Excellent written and verbal communication skills

Roles/Responsibilities:

- Active involvement in the building of a recommendation engine
- Design new processes and build large, complex data sets
- Conduct statistical modeling and experiment design
- Test and validate predictive models
- Build web prototypes and perform data visualization
- Generate algorithms and create computer models
- Should possess excellent analytical and troubleshooting skills
- Should be aware of the Agile mode of operations and have been part of Scrum teams
- Should be open to working in a DevOps model, with responsibility for both development and support as the application goes live
- Should be able to work in shifts (if required)
- Should be open to working on fast-paced projects with multiple stakeholders

Tier 1 MNC

Agency job
Chennai, Pune, Bengaluru (Bangalore), Noida, Gurugram, Kochi (Cochin), Coimbatore, Hyderabad, Mumbai, Navi Mumbai
3 - 12 yrs
₹3L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
Greetings,
We are hiring for a Tier 1 MNC for a software developer role requiring good knowledge of Spark, Hadoop and Scala.

This company provides on-demand cloud computing platforms.

Agency job
via New Era India by Niharica Singh
Remote, Pune, Mumbai, Bengaluru (Bangalore), Gurugram, Hyderabad
15 - 25 yrs
₹35L - ₹55L / yr
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Windows Azure
Architecture
Python
  • 15+ years of hands-on technical application architecture experience and application build/modernization experience
  • 15+ years of experience as a technical specialist in Customer-facing roles.
  • Ability to travel to client locations as needed (25-50%)
  • Extensive experience architecting, designing and programming applications in an AWS Cloud environment
  • Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
  • Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
  • Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
  • Agile software development expert
  • Experience with continuous integration tools (e.g. Jenkins)
  • Hands-on familiarity with CloudFormation
  • Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
  • Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
  • Strong practical application development experience on Linux and Windows-based systems
  • Extracurricular software development passion (e.g. active open-source contributor)
Posted by Shanu Mohan
Gurugram, Mumbai, Bengaluru (Bangalore)
2 - 4 yrs
₹10L - ₹17L / yr
Python
PySpark
Amazon Web Services (AWS)
Spark
Scala
  • Hands-on experience in any cloud platform
  • Versed in Spark, Scala/Python, SQL
  • Microsoft Azure experience
  • Experience working on real-time data processing pipelines
Posted by Jhalak Doshi
Remote, Mumbai
4 - 6 yrs
₹1L - ₹15L / yr
Scala
Java
Akka
MySQL
Google Cloud Platform (GCP)

Job Description:

TeamExtn is looking for a passionate Senior Scala Engineer. You will be expected to build pragmatic solutions on mission-critical initiatives. If you know your stuff, see the beauty in code, have depth and breadth of knowledge, advocate best practices, and love to work with distributed systems, then this is an ideal position for you.

As a core member of our Special Projects team, you will work on various new projects in a startup-like environment. These projects may include building new APIs (REST/GraphQL/gRPC) for new products, integrating new products with core Carvana services, building highly scalable back-end processing systems in Scala, and integrating with systems backed by machine learning. You will use cutting-edge functional Scala libraries such as ZIO. You will have the opportunity to work closely with our Product, Experience and Software Engineering teams to deliver impact.

Responsibilities:

  • Build highly scalable APIs and back end processing systems for new products
  • Contribute to the full software development lifecycle, from design and development to testing and operating in production
  • Communicate effectively with engineers, product managers and data scientists
  • Drive scalability and performance within our distributed AI platform

Skills And Experience:

  • 4+ years' experience with Scala, Java or another functional language
  • Experience with Akka and Lightbend stack
  • Expert with PostgreSQL, MySQL or MS SQL
  • Experience in architecting, developing, deploying and operating large scale distributed systems and actor systems
  • Experience with cloud APIs (e.g., GCP, AWS, Azure)
  • Messaging systems such as GCP Pub/Sub, RabbitMQ, Kafka
  • Strong foundation in algorithms and data structures and their real-world use cases.
  • Solid understanding of computer systems and networks
  • Production quality coding standards and patterns

 

Bonus Skills:

  • Experience with functional programming in Scala
  • Knowledge of ZIO and related ecosystem
  • Experience with functional database libraries in Scala (Quill preferred)
  • Kubernetes and Docker
  • Elasticsearch
  • Typescript, React and frontend UI development experience
  • gRPC, GraphQL
Mumbai, Pune
8 - 14 yrs
₹10L - ₹15L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
Job Responsibilities
1. Understand the business problem and translate these to data services and
engineering outcomes.
2. Expertise in working on cloud application designs, cloud approval plans, and
systems required to manage cloud storage.
3. Explore new technologies and learn new techniques to solve business problems
creatively
4. Collaborate with different teams - engineering and business, to build better data
products
5. Regularly evaluate cloud applications, hardware, and software.
6. Respond to technical issues in a professional and timely manner.
7. Identify the top cloud architecture solutions to successfully meet the strategic
needs of the company.
8. Offer guidance in infrastructure movement techniques including bulk application
transfers into the cloud.
9. Manage team and handle delivery of 2-3 projects
JD | Data Architect 24-Aug-2021

Qualifications
Is education overrated? Yes, we believe so. But there is no other way to locate you, so we look for at least a degree in computer science, computer engineering, information technology, or a relevant field, along with:

1. 4-6 years of experience in data handling
2. Hands-on experience in any one programming language (Python, Java, Scala)
3. Understanding of SQL is a must
4. Big data (Hadoop, Hive, Yarn, Sqoop)
5. MPP platforms (Spark, Presto)
6. Data-pipeline & scheduler tools (Oozie, Airflow, NiFi)
7. Streaming engines (Kafka, Storm, Spark Streaming)
8. Any relational database or DW experience
9. Any ETL tool experience
10. Hands-on experience in pipeline design, ETL and application development
11. Hands-on experience in cloud platforms like AWS, GCP etc.
12. Good communication skills and strong analytical skills
13. Experience in team handling and project delivery

Payments bank

Agency job
Navi Mumbai, Mumbai
5 - 7 yrs
₹7L - ₹18L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript
Job location- Navi Mumbai
  • Professional experience in enterprise Java software development using the Spring MVC framework, RESTful APIs and SOA
  • Experience working in Cloud(AWS)
  • Outstanding problem solving skills
  • API Development experience
  • Exposure to monitoring tools such as ELK, Splunk
  • Experience with Selenium for UI automated tests written in Cucumber or Scala
  • Able to handle day-to-day challenges and own the resolution of issues as they arise.
Agency job
via Response Informatics by Swagatika Sahoo
Hyderabad, Pune, Chennai, Bengaluru (Bangalore), Mumbai
5 - 6 yrs
₹18L - ₹25L / yr
Big Data
Big Data Engineer
Spark
Apache Spark
Scala

 

Experience: 5-6+ years

 

 

Must Have

 

  • Apache Spark, Spark Streaming, Scala programming, Apache HBase
  • Unix scripting, SQL knowledge
Good to Have

  • Experience working with a graph database, preferably JanusGraph
  • Experience working with document databases and Apache Solr

 

Job Description

A Data Engineer with experience in the following areas:

 

  • Designing and implementing high-performance data ingestion pipelines from multiple sources using Scala and Apache Spark
  • Experience with event-based Spark Streaming technologies to ingest data
  • Developing scalable and re-usable frameworks for ingesting data sets
  • Integrating end-to-end data pipelines to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
  • Preference for big data certifications such as Cloudera Certified Professional (CCP) and Cloudera Certified Associate (CCA)
  • Working within an Agile delivery methodology to deliver product implementation in iterative sprints
  • Strong knowledge of data management principles
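Shorn of Spark specifics, the "scalable and re-usable frameworks for ingesting data sets" bullet reduces to a parse, validate, load pipeline. Below is a stdlib-only Scala sketch of that shape; the record layout, stage names and CSV format are invented for illustration, and in production each stage would typically be a Spark transformation over a distributed dataset:

```scala
// Illustrative stdlib-only sketch of a re-usable ingestion pipeline:
// parse raw lines, drop invalid records, and return the clean rows.
final case class Record(id: String, amount: Double)

trait IngestionPipeline {
  // Parse one raw CSV line into a typed record, or None if malformed.
  def parse(line: String): Option[Record] =
    line.split(",", -1) match {
      case Array(id, amt) if id.nonEmpty => amt.toDoubleOption.map(Record(id, _))
      case _                             => None
    }

  // Domain validation rule: reject negative amounts.
  def validate(r: Record): Boolean = r.amount >= 0.0

  // Template method: parse, filter invalid rows, collect clean records.
  def ingest(lines: Iterator[String]): Vector[Record] =
    lines.flatMap(parse).filter(validate).toVector
}

object CsvIngest extends IngestionPipeline

object IngestDemo {
  def main(args: Array[String]): Unit = {
    val raw = Iterator("a1,10.5", "bad line", "a2,-3.0", "a3,7")
    println(CsvIngest.ingest(raw)) // keeps a1 and a3; drops malformed/negative rows
  }
}
```

The trait-plus-template-method structure is what makes the framework re-usable: a new data set overrides `parse` and `validate` while `ingest` stays shared.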

 

Location: PAN INDIA

Mumbai
0 - 3 yrs
₹4L - ₹15L / yr
C++
Object Oriented Programming (OOPs)
C
Java
Data Structures
  • Augmenting, improving, redesigning, and/or re-implementing Dolat's low-latency/high-throughput production trading environment, which collects data from and disseminates orders to exchanges around the world
  • Optimizing this platform by using network and systems programming, as well as other advanced techniques
  • Developing systems that provide easy access to historical market data and trading simulations
  • Building risk-management and performance-tracking tools
  • Shaping the future of Dolat through regular interviewing and infrequent campus recruiting trips
  • Implementing domain-optimized data structures
  • Learning and internalizing the theories behind the current trading system
  • Participating in the design, architecture and implementation of automated trading systems
  • Taking ownership of the system from design through implementation
Posted by Ravi Mevcha
Mumbai, Navi Mumbai
2 - 4 yrs
₹8L - ₹12L / yr
Spark
Big Data
ETL
Data engineering
ADF

Job Overview

We are looking for a Data Engineer to join our data team to solve data-driven critical business problems. The hire will be responsible for expanding and optimizing the existing end-to-end architecture, including the data pipeline architecture. The Data Engineer will collaborate with software developers, database architects, data analysts, data scientists and the platform team on data initiatives, and will ensure optimal data delivery architecture is consistent throughout ongoing projects. The right candidate should have hands-on experience developing a hybrid set of data pipelines depending on the business requirements.

Responsibilities

  • Develop, construct, test and maintain existing and new data-driven architectures.
  • Align architecture with business requirements and provide solutions which fit best to solve the business problems.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure 'big data' technologies.
  • Data acquisition from multiple sources across the organization.
  • Use programming languages and tools efficiently to collate the data.
  • Identify ways to improve data reliability, efficiency and quality.
  • Use data to discover tasks that can be automated.
  • Deliver updates to stakeholders based on analytics.
  • Set up practices on data reporting and continuous monitoring.

Required Technical Skills

  • Graduate in Computer Science or a similar quantitative area
  • 1+ years of relevant work experience as a Data Engineer or in a similar role
  • Advanced SQL knowledge, data modelling, and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases
  • Experience in developing and optimizing ETL pipelines, big data pipelines, and data-driven architectures
  • Must have strong big-data core knowledge and experience in programming using Spark - Python/Scala
  • Experience with an orchestration tool like Airflow or similar
  • Experience with Azure Data Factory is good to have
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management
  • Experience supporting and working with cross-functional teams in a dynamic environment
  • Good understanding of Git workflow and test-case-driven development; experience using CI/CD is good to have
  • Good to have some understanding of Delta tables

It would be an advantage if the candidate also has experience using the following software/tools:

  • Experience with big data tools: Hadoop, Spark, Hive, etc.
  • Experience with relational SQL and NoSQL databases
  • Experience with cloud data services
  • Experience with object-oriented/object function scripting languages: Python, Scala, etc.
Posted by Neeta Singh Mehta
Bengaluru (Bangalore), Hyderabad, Noida, Gurugram, Mumbai, Pune, NCR (Delhi | Gurgaon | Noida)
3 - 8 yrs
₹6L - ₹32L / yr
Java
Python
Ruby
PHP
C++

Greetings from Amazon!

 

It is our pleasure to personally invite you to apply for a job with Amazon Development Centre India (ADCI). At Amazon we hire people with a passion for technology, and you happen to be one of the shortlisted candidates. Our business is committed to recognizing potential and creating teams that embrace innovation.

 

Please find the eligibility criteria and requirements below:

 

Job title: SDE-II (Software Development Engineer)

Role opportunity: Permanent/Full-time/FTE/Regular

Work location: Hyderabad/Bangalore/Gurgaon

 

Must Have

  • Strong Exposure to Data Structures, Algorithms, Coding, System Design (LLD, HLD, OOAD), Distributed systems, problem solving skills, Architecture (MVC/Microservices), logical thinking.

Amazon (ADCI) - If you are looking for an opportunity to solve deep technical problems and build innovative solutions in a fast-paced environment working with smart, passionate software developers, this might be the role for you. Amazon’s transportation systems get millions of packages to customers worldwide faster and cheaper while providing a world-class customer experience – from checkout to shipment tracking to delivery. Our software systems include services that handle thousands of requests per second, make business decisions impacting billions of dollars a year, integrate with a network of small and large carriers worldwide, manage business rules for millions of unique products, and improve the experience for millions of online shoppers. With rapid expansion into new geographies, innovations in supply chain, delivery models and customer experience, an increasingly complex transportation network, an ever expanding selection of products and a growing number of shipments worldwide, we have an opportunity to build software that scales the business, leads the industry through innovation and delights millions of customers worldwide.

 

As an SDE, you will develop a deep understanding of our business, work closely with development teams and own the architecture and end-to-end delivery of software components.

About Amazon India:

Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.

We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.

We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India. Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.

 

Basic Qualifications:

 

  • 3+ years’ experience building successful production software systems
  • A solid grounding in Computer Science fundamentals (based on a BS or MS in CS or related field)
  • The ability to convert raw requirements into good designs while exploring technical feasibility tradeoffs
  • Expertise in system design (design patterns, LLD, HLD, SOLID principles, OOAD, distributed systems, etc.), architecture (MVC/microservices)
  • Good understanding of at least some modern programming languages (Java) and open-source technologies (C++, Python, Scala, C#, PHP, Ruby, etc.)
  • Excellence in technical communication
  • Has experience in mentoring other software developers

 

Preferred Qualifications:

 

  • BS/MS in Computer Science or equivalent
  • Experience developing service oriented architectures and an understanding of design for scalability, performance and reliability
  • Demonstrated ability to mentor other software developers to maintain architectural vision and software quality
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment
  • Expertise in delivering high-quality, innovative applications
  • Strong desire to build, sense of ownership, urgency, and drive
  • Strong organizational and problem solving skills with great attention to detail
  • Ability to triage issues, react well to changes, work with teams and multi-task across multiple products and projects.
  • Experience building highly scalable, high availability services
  • The ideal candidate will be a visionary leader, builder and operator.
  • He/she should have experience leading or contributing to multiple simultaneous product development efforts and initiatives.
  • He/she needs to balance technical leadership with strong business judgment to make the right decisions about technology choices.
  • He/she needs to be constantly striving for simplicity, and at the same time demonstrate significant creativity, innovation and judgment
  • Proficiency in at least one modern programming language.
  • Experience with SQL or NoSQL databases.
  • Strong sense of ownership, urgency, and drive.
  • Demonstrated leadership abilities in an engineering environment in driving operational excellence and best practices.
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment.
  • Excellent communication, collaboration, reporting, analytical and problem solving skills.

 

 

Good to Have:

  • Knowledge of professional software engineering practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Experience with enterprise-wide systems
  • Experience influencing software engineering best practices within your team
  • Hands-on expertise in many disparate technologies, typically ranging from front-end user interfaces through to back-end systems and all points in between
  • Strong written and verbal communication skills preferred

 

Key Points to remember:

 

  • Strong knowledge of the Software Development Life Cycle methodology
  • Technical design, development and implementation decisions on the use of technology in area(s) of specialization.
  • Write or modify programming code to suit customers' needs.
  • Unit test to assure the code meets requirements, including integration tests as needed.
  • Ability to understand and analyze issues and use judgment to make decisions.
  • Strong problem solving & troubleshooting skills
  • Strong communication skills
  • Responsible for self-development according to professional development plan


India's leading sports platform

Agency job
Mumbai
3 - 7 yrs
₹25L - ₹50L / yr
NodeJS (Node.js)
Scala
Rust
Your Role:

  • Understanding and solving real business needs at a large scale by applying your analytical problem-solving skills
  • Designing and building solutions for edge-layer applications like GraphQL
  • Identifying and optimising performance bottlenecks
  • Architecting and building robust, scalable, and highly available solutions for use cases like real-time updates, data parsing and aggregation
  • Leading cross-functional initiatives and collaborating with engineers across teams

Must Have:

  • Hands-on experience in Scala, NodeJS or Rust
  • Strong problem-solving skills and reasoning ability

Good to Have:

  • Experience in developing performant, high-throughput systems
  • Strong system design skills, preferably in designing edge-layer applications
  • Experience in functional programming, preferably with a working knowledge of type classes
  • Experience in writing testable programs
  • Experience in working with the AWS stack
  • Prior experience with GraphQL
  • Experience in identifying and optimising hotspots and performance bottlenecks
  • An understanding of operating systems and networking fundamentals

Note: Applications accepted only from candidates who have worked in product-based companies.

SAP company

Agency job
Mumbai, Navi Mumbai
3 - 8 yrs
₹7L - ₹13L / yr
Data engineering
Apache Kafka
Apache Spark
Hadoop
Apache Flink
Build data systems and pipelines using Apache Flink (or similar).
Understand various raw data input formats, build consumers on Kafka/ksqlDB for them, and ingest large amounts of raw data into Flink and Spark.
Conduct complex data analysis and report on results.
Build various aggregation streams for data and convert raw data into various logical processing streams.
Build algorithms to integrate multiple sources of data and create a unified data model from all the sources.
Build a unified data model on both SQL and NoSQL databases to act as the data sink.
Communicate the designs effectively with the full-stack engineering team for development.
Explore machine learning models that can be fitted on top of the data pipelines.

Mandatory Qualifications and Skills:

Deep knowledge of the Scala and Java programming languages is mandatory
Strong background in streaming data frameworks (Apache Flink, Apache Spark) is mandatory
Good understanding of, and hands-on skills with, streaming messaging platforms such as Kafka
Familiarity with R, C and Python is an asset
Analytical mind and business acumen with strong math skills (e.g. statistics, algebra)
Problem-solving aptitude
Excellent communication and presentation skills
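The "integrate multiple sources of data and create a unified data model" responsibility can be sketched without a cluster. Below is a stdlib-only Scala reduction that merges heterogeneous events, keyed by entity id, into one unified record; the event shapes and field names are invented for illustration, and in the role this logic would live inside Flink or Spark keyed-stream operators:

```scala
// Stdlib-only sketch: merge raw events from two "sources" into a unified
// per-user model, the way a keyed aggregation stream would in Flink/Spark.
sealed trait RawEvent { def userId: String }
final case class Click(userId: String, page: String)      extends RawEvent
final case class Purchase(userId: String, amount: Double) extends RawEvent

final case class UserModel(userId: String, clicks: Int = 0, spend: Double = 0.0)

object Unify {
  // Accumulator step: fold one raw event into the unified model.
  def merge(model: UserModel, ev: RawEvent): UserModel = ev match {
    case Click(_, _)      => model.copy(clicks = model.clicks + 1)
    case Purchase(_, amt) => model.copy(spend = model.spend + amt)
  }

  // Group events by key, then fold each group into the unified model,
  // mirroring keyBy + aggregate in a streaming engine.
  def unify(events: Seq[RawEvent]): Map[String, UserModel] =
    events.groupBy(_.userId).map { case (id, evs) =>
      id -> evs.foldLeft(UserModel(id))(merge)
    }
}
```

In a real pipeline `merge` would be the body of a stateful operator and the map would be a keyed state store backing the SQL/NoSQL sink.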
Posted by Reena Bandekar
Mumbai
5 - 9 yrs
₹15L - ₹22L / yr
Big Data
Hadoop
Spark
Apache Hive
ETL
JD of Data Architect
As a Data Architect, you work with business leads, analysts and data scientists to understand the business domain, and manage data engineers to build data products that empower better decision making. You are passionate about the data quality of our business metrics and the flexibility of your solution, which scales to respond to broader business questions.
If you love to solve problems using your skills, then come join Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What You’ll Do
● Understand the business problem and translate these to data services and engineering
outcomes
● Explore new technologies and learn new techniques to solve business problems
creatively
● Collaborate with many teams - engineering and business, to build better data products
● Manage team and handle delivery of 2-3 projects

What We’re Looking For
● 4-6 years of experience with:
○ Hands-on experience of any one programming language (Python, Java, Scala)
○ Understanding of SQL is a must
○ Big data (Hadoop, Hive, Yarn, Sqoop)
○ MPP platforms (Spark, Presto)
○ Data-pipeline & scheduler tools (Oozie, Airflow, Nifi)
○ Streaming engines (Kafka, Storm, Spark Streaming)
○ Any Relational database or DW experience
○ Any ETL tool experience
● Hands-on experience in pipeline design, ETL and application development
● Hands-on experience in cloud platforms like AWS, GCP etc.
● Good communication skills and strong analytical skills
● Experience in team handling and project delivery
Posted by Reena Bandekar
Mumbai
5 - 12 yrs
₹10L - ₹20L / yr
Big Data
Hadoop
Apache Hive
Architecture
Data engineering
+4 more
JD of Data Engineer
As a Data Engineer, you are a full-stack data engineer that loves solving business problems.
You work with business leads, analysts and data scientists to understand the business domain
and engage with fellow engineers to build data products that empower better decision making.
You are passionate about data quality of our business metrics and flexibility of your solution that
scales to respond to broader business questions.
If you love to solve problems using your skills, then come join Team Searce. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.

What You’ll Do
● Understand the business problem and translate these to data services and engineering
outcomes
● Explore new technologies and learn new techniques to solve business problems
creatively
● Think big! and drive the strategy for better data quality for the customers
● Collaborate with many teams - engineering and business, to build better data products

What We’re Looking For
● 1-3 years of experience with:
○ Hands-on experience of any one programming language (Python, Java, Scala)
○ Understanding of SQL is a must
○ Big data (Hadoop, Hive, Yarn, Sqoop)
○ MPP platforms (Spark, Pig, Presto)
○ Data-pipeline & scheduler tools (Oozie, Airflow, Nifi)
○ Streaming engines (Kafka, Storm, Spark Streaming)
○ Any Relational database or DW experience
○ Any ETL tool experience
● Hands-on experience in pipeline design, ETL and application development
Posted by Sunil Bolisetty
Mumbai
4 - 10 yrs
₹4L - ₹30L / yr
Scala
Java
Python
  1. Should be very strong in Scala development (coding)
  2. Along with any combination of Java/Python/Spark/Big Data
  3. 3+ years' experience in Core Java/Scala with a good understanding of multithreading
  4. The candidate must be good with Computer Science fundamentals
  5. Exposure to Python/Perl and Unix/Korn shell scripting
  6. Code management tools such as Git/Perforce
  7. Experience with large batch-oriented systems
  8. DB2/Sybase or any RDBMS
  9. Prior experience with financial products, particularly OTC Derivatives
  10. Exposure to counterparty risk, margining, collateral or confirmation systems
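The "good understanding of multithreading" requirement above is the kind of thing often probed with small concurrency exercises. Here is a minimal sketch of submitting independent tasks to a thread pool and combining their results; the pricing function and the (notional, rate) trade shape are hypothetical placeholders, not from the posting.

```java
import java.util.*;
import java.util.concurrent.*;

// Sketch of basic thread-pool concurrency: fan out independent pricing tasks
// via an ExecutorService, then join the Futures to combine results.
public class ParallelTasks {
    public static double price(double notional, double rate) {
        return notional * rate; // placeholder computation
    }

    // Price several (notional, rate) trades concurrently and sum the results.
    public static double totalExposure(double[][] trades) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Future<Double>> futures = new ArrayList<>();
            for (double[] t : trades)
                futures.add(pool.submit(() -> price(t[0], t[1])));
            double total = 0.0;
            for (Future<Double> f : futures)
                total += f.get(); // blocks until that task completes
            return total;
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```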
Posted by Poonam More
Mumbai
1 - 3 yrs
₹1.8L - ₹3L / yr
.NET
ASP.NET
Java
Data Structures
Algorithms
+7 more

Job description

Responsibilities

  • Participate in requirements analysis
  • Collaborate with internal teams to produce software design and architecture
  • Write clean, scalable code using .NET programming languages
  • Test and deploy applications and systems
  • Revise, update, refactor and debug code
  • Improve existing software
  • Develop documentation throughout the software development life cycle (SDLC)
  • Serve as an expert on applications and provide technical support

 

Requirements

  • Proven experience as a .NET Developer or Application Developer
  • Familiarity with the ASP.NET framework, SQL Server and design/architectural patterns (e.g. Model-View-Controller (MVC))
  • Knowledge of at least one of the .NET languages (e.g. C#, Visual Basic .NET) and HTML5/CSS3
  • Familiarity with architecture styles/APIs (REST, RPC)
  • Understanding of Agile methodologies
  • Excellent troubleshooting and communication skills
  • Attention to detail
  • BSc/BA in Computer Science, Engineering or a related field

 

No. of Requirements: 3

 

Experience: 1-3 yrs

Location: Goregaon, Mumbai.

Urgent Joining.

Required Skills:

.NET Framework 3.5
C#
Visual Studio 2018
Debugging
ASP.NET MVC
Entity Framework
jQuery
Bootstrap
HTML
CSS
JavaScript

Remote, Chennai, Mumbai, Bengaluru (Bangalore)
2 - 7 yrs
₹2.4L - ₹3.6L / yr
Java
Software Development
Data Structures
Algorithms
Scala
+14 more
Impactree Data Technologies is a young social enterprise helping social organisations scale development programmes using real-time data monitoring and analytics. Impactree specifically works on building data intelligence for two sectors - rural livelihoods and education. Our team consists of members qualified from London Business School, IIM Kozhikode, ICAI, Anna University, etc.

We are an upcoming profitable social enterprise, and as a part of the team we are looking for a candidate who can work with us to build better analytics and intelligence into our platform, Prabhaav.

We are looking for a Software Developer to build and implement functional programs. You will work with other Developers and Product Managers throughout the software development life cycle.


In this role, you should be a team player with a keen eye for detail and problem-solving skills. Experience in Agile frameworks and popular coding languages (e.g. JavaScript) is a plus.


Your goal will be to build efficient programs and systems that serve user needs. 

Technical skills we are looking for:

 

  • Producing clean, efficient code based on specifications
  • Coding abilities in HTML, PHP, JS, JSP/Servlets, Java, DevOps (basic knowledge)
  • Additional skills (preferred): NodeJS, Python, AngularJS
  • System administrator experience: Linux (Ubuntu/RedHat), Windows CE-Embedded
  • Database experience: MySQL, Postgres, MongoDB
  • Data format experience: JSON, XML, AJAX, jQuery
  • Depth in software architecture design, especially for stand-alone software products or SaaS platforms
  • Basic experience/knowledge of microservices, REST APIs and SOAP methodologies
  • Should have built backend architecture for long-standing applications
  • Good HTML design sense
  • Experience with AWS services like EC2 and Lightsail is preferred
  • Testing and deploying programs and systems
  • Fixing and improving existing software
  • Good understanding of OOP and similar concepts
  • Research on new JS frameworks like React and Angular

 

Experience areas we are looking for: 
  • Proven experience as a Software Developer, Software Engineer or similar role
  • Familiarity with Agile development methodologies
  • Experience with software design and development in a test-driven environment
  • Knowledge of coding languages (e.g. Java, JavaScript) and frameworks/systems (e.g. AngularJS, Git)
  • Experience with databases and Object-Relational Mapping (ORM) frameworks (e.g. Hibernate)
  • Ability to learn new languages and technologies
  • Excellent communication skills
  • Resourcefulness and troubleshooting aptitude
  • Attention to detail

 

Posted by Seema Pahwa
Mumbai
2 - 6 yrs
₹6L - ₹15L / yr
Big Data
Spark
Scala
Amazon Web Services (AWS)
Apache Kafka

 

The Data Engineering team is one of the core technology teams of Lumiq.ai and is responsible for creating all the Data related products and platforms which scale for any amount of data, users, and processing. The team also interacts with our customers to work out solutions, create technical architectures and deliver the products and solutions.

If you are someone who is always pondering how to make things better, how technologies can interact, how various tools, technologies, and concepts can help a customer or how a customer can use our products, then Lumiq is the place of opportunities.

 

Who are you?

  • Enthusiast is your middle name. You know what’s new in Big Data technologies and how things are moving
  • Apache is your toolbox and you have been a contributor to open source projects or have discussed the problems with the community on several occasions
  • You use cloud for more than just provisioning a Virtual Machine
  • Vim is friendly to you and you know how to exit Nano
  • You check logs before screaming about an error
  • You are a solid engineer who writes modular code and commits in GIT
  • You are a doer who doesn’t say “no” without first understanding
  • You understand the value of documentation of your work
  • You are familiar with Machine Learning Ecosystem and how you can help your fellow Data Scientists to explore data and create production-ready ML pipelines

 

Eligibility

Experience

  • At least 2 years of Data Engineering Experience
  • Have interacted with Customers


Must Have Skills

  • Amazon Web Services (AWS) - EMR, Glue, S3, RDS, EC2, Lambda, SQS, SES
  • Apache Spark
  • Python
  • Scala
  • PostgreSQL
  • Git
  • Linux


Good to have Skills

  • Apache NiFi
  • Apache Kafka
  • Apache Hive
  • Docker
  • Amazon Certification

 

 

Posted by Aishwarya Hire
Mumbai
3 - 9 yrs
₹5L - ₹12L / yr
Apache Hive
Hadoop
Scala
Spark
Amazon Web Services (AWS)
+2 more
Job Overview :

Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies will inspire your team to follow suit.

Responsibilities and Duties :

- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data - structured, semi-structured and unstructured. Big data knowledge, especially of Spark and Hive, is highly preferred.

- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level :

- Bachelor's degree in Computer Science or equivalent

Experience :

- Minimum 3+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience

- Expertise in application, data and infrastructure architecture disciplines

- Expertise in designing data integrations using ETL and other data integration patterns

- Advanced knowledge of architecture, design and business processes

Proficiency in :

- Modern programming languages like Java, Python, Scala

- Big Data technologies Hadoop, Spark, HIVE, Kafka

- Writing decently optimized SQL queries

- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)

- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions

- Knowledge of system development lifecycle methodologies, such as Waterfall and Agile.

- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.

- Experience generating physical data models and the associated DDL from logical data models.

- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping,
and data rationalization artifacts.

- Experience enforcing data modeling standards and procedures.

- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.

- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills :

Must Know :

- Core big-data concepts

- Spark - PySpark/Scala

- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)

- Handling of various file formats

- Cloud platform - AWS/Azure/GCP

- Orchestration tool - Airflow
Posted by Aishwarya Hire
Mumbai
3 - 7 yrs
₹7L - ₹20L / yr
Hadoop
Big Data
Scala
Spark
Amazon Web Services (AWS)
+3 more
Job Overview :

Your mission is to help lead the team toward creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies will inspire your team to follow suit.

Responsibilities and Duties :

- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data - structured, semi-structured and unstructured. Big data knowledge, especially of Spark and Hive, is highly preferred.

- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions

Education level :

- Bachelor's degree in Computer Science or equivalent

Experience :

- Minimum 5+ years of relevant experience working on production-grade projects, with hands-on, end-to-end software development experience

- Expertise in application, data and infrastructure architecture disciplines

- Expertise in designing data integrations using ETL and other data integration patterns

- Advanced knowledge of architecture, design and business processes

Proficiency in :

- Modern programming languages like Java, Python, Scala

- Big Data technologies Hadoop, Spark, HIVE, Kafka

- Writing decently optimized SQL queries

- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)

- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions

- Knowledge of system development lifecycle methodologies, such as Waterfall and Agile.

- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.

- Experience generating physical data models and the associated DDL from logical data models.

- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping,
and data rationalization artifacts.

- Experience enforcing data modeling standards and procedures.

- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.

- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals

Skills :

Must Know :

- Core big-data concepts

- Spark - PySpark/Scala

- A data integration tool like Pentaho, NiFi, SSIS, etc. (at least one)

- Handling of various file formats

- Cloud platform - AWS/Azure/GCP

- Orchestration tool - Airflow
Company is into Product Development.

Agency job via Master Mind Consultancy by Dnyanesh Panchal
Remote, Mumbai
10 - 18 yrs
₹30L - ₹55L / yr
Scala
Big Data
Java
Amazon Web Services (AWS)
ETL

What's the role?

Your role as a Principal Engineer will involve working with various teams. You will need full knowledge of the software development lifecycle and Agile methodologies, and will demonstrate multi-tasking skills under tight deadlines and constraints. You will regularly contribute to the development of work products (including analyzing, designing, programming, debugging, and documenting software) and may work with customers to resolve challenges and respond to suggestions for improvements and enhancements. You will set the standards and principles for the product you drive.

  • Set up coding practices, guidelines & quality standards for the software delivered.
  • Determines operational feasibility by evaluating analysis, problem definition, requirements, solution development, and proposed solutions.
  • Documents and demonstrates solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments and clear code.
  • Prepares and installs solutions by determining and designing system specifications, standards, and programming.
  • Improves operations by conducting systems analysis; recommending changes in policies and procedures.
  • Updates job knowledge by studying state-of-the-art development tools, programming techniques, and computing equipment; participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations.
  • Protects operations by keeping information confidential.
  • Develops software solutions by studying information needs; conferring with users; studying systems flow, data usage, and work processes; investigating problem areas; following the software development lifecycle.

Who are you? You are a go-getter with an eye for detail, strong problem-solving and debugging skills, and a BE/MCA/M.E./M.Tech or equivalent degree from a reputed college/university.

 

Essential Skills / Experience:

  • 10+ years of engineering experience
  • Experience in designing and developing high volume web-services using API protocols and data formats
  • Proficient in API modelling languages and annotation
  • Proficient in Java programming
  • Experience with Scala programming
  • Experience with ETL systems
  • Experience with Agile methodologies
  • Experience with Cloud service & storage
  • Proficient in Unix/Linux operating systems
  • Excellent oral and written communication skills

Preferred:
  • Functional programming languages (Scala, etc)
  • Scripting languages (bash, Perl, Python, etc)
  • Amazon Web Services (Redshift, ECS etc)
Posted by Damini Gawali
Seawoods, Navi Mumbai, Mumbai
1 - 7 yrs
₹15L - ₹30L / yr
Java
Data Structures
PHP
Python
Ruby on Rails (ROR)
+9 more

Responsibilities:

  • Own end-to-end development and operations of high-performance Spring Hibernate applications.
  • Design the architecture and deliver clean, testable and scalable code
  • Participate in requirement gathering and display a strong sense of ownership and delivery

 

Skills and Qualifications:

  • Strong in Data Structures, algorithms and Object Oriented Concepts, Message Queues and Caching
  • BE/ B.Tech preferred
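The "Message Queues and Caching" requirement above is often tested with a classic data-structures exercise: implement an LRU cache. A minimal sketch built on `java.util.LinkedHashMap`'s access-order mode; the capacity handling and names are illustrative, not from the posting.

```java
import java.util.*;

// Minimal LRU cache: LinkedHashMap in access-order mode keeps entries sorted
// by recency, and removeEldestEntry evicts once capacity is exceeded.
public class LruCache<K, V> {
    private final int capacity;
    private final LinkedHashMap<K, V> backing;

    public LruCache(int capacity) {
        this.capacity = capacity;
        // true = access order: get() moves an entry to the most-recent end.
        this.backing = new LinkedHashMap<>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > LruCache.this.capacity;
            }
        };
    }

    public void put(K key, V value) { backing.put(key, value); }
    public V get(K key)             { return backing.get(key); }
    public int size()               { return backing.size(); }
}
```

Reading an entry counts as a use, so recently read keys survive eviction while stale ones fall out first.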
Thane, Mumbai
3 - 8 yrs
₹7L - ₹25L / yr
Go Programming (Golang)
Java
Data Structures
Algorithms
Scala
+4 more

Job Summary

  • Excellent hands-on experience with Golang (or, if not Golang, with Java, .NET and/or NodeJS)
  • Write CRONs and background tasks required for the growth of business and product
  • Build REST APIs as required
  • Ability to code using design principles
  • Write reusable code and libraries for future use
  • Have the working knowledge of Microservices Architecture using Docker
  • Collaborate with other team members and stakeholders in executing various new and existing ideas
  • Possesses the knowledge of developing and deploying in Linux environments
  • Passion for building great products and loads of energy.

Key Skills
Skills that we would be more than happy for a dev to have:
  • Worked in CI/CD environments
  • Developed code using a TDD/BDD approach
  • Worked with virtualization on Linux (KVM)
  • Experience working in an Agile development environment

About You

We’re looking for exceptional Engineers with an amazing breadth and depth of technology expertise! If you’re the kind of person who looks at the bigger picture and wants to build something that has a real impact on the end user, go ahead and apply for the position.

 

Ability to see the big picture but still love to code!

Strong in backend languages, such as Java, DotNet or NodeJS!

Familiar with client-side frameworks such as React, Angular, Vue etc.

Strong HTML/CSS skills – you understand not only how to build the data, but how to make it look great too.

Knowledge of architectural design and you like to build something scalable and flexible to support business

Agile or Scrum is your favorite development approach.

And when we start talking about performance, security and unit testing? Well that’s music to your ears

 

Mumbai
5 - 10 yrs
₹12L - ₹20L / yr
Object Oriented Programming (OOPs)
Shell Scripting
Java
SOAP
JSON
+8 more
We are looking for a Java developer for one of our major investment banking clients who can take ownership of the whole end-to-end delivery, performing analysis, design, coding, testing and maintenance of large-scale and distributed applications.

Job Profile: Java Developer
Location: Mumbai

Description: A core Java developer is required for a Tier 1 investment bank supporting the Delta One Structured Products IT group. This is a global front-office team that supports the global OTC Equity Swap portfolio, Single Name, and Index derivative businesses. We are designing a complete restructure of the Equity Swaps trading platform, and this particular role is within the core cash flow and valuations area. The role will require the candidate to work closely with the cash flow engines team to solve problems that combine both finance and technology. This is an exciting hands-on role for a self-starter who has a thirst for new challenges as well as new technologies. The candidate should possess good analytical skills, strong software engineering skills, and a logical approach to problem-solving; be able to work in a fast-paced environment, liaising with demanding stakeholders to understand complex requirements; and be able to prioritize work under pressure with minimal supervision. The candidate should be a problem solver, able to bring positivity and enthusiasm in thinking about and offering potential solutions for architectural considerations.

Position Profile: We are looking for someone to help own problems and be able to demonstrate leadership and responsibility for the delivery of new features. As part of the development cycle, you would be expected to write quality unit tests, supply documentation if relevant for new feature build-outs, and be involved in the test cycle (UAT, integration, regression) for the delivery and fixing of bugs for your new features.

Although the role is predominantly Java, we require someone who is flexible with the development environment, as some days you might be writing Java, and other days you might be fixing stored procedures or Perl scripts. You would be expected to get involved in the Level 3 production support rota, which is shared between our developers on a monthly cycle, and to occasionally help with weekend deployment activities to deploy and verify any code changes you have been involved in.

Team Profile: The team and role are ideal for someone looking for a strong career development path with many opportunities to grow, learn and develop. The role requires someone who is flexible and able to respond to a dynamic business environment. The candidate must be adaptable to work across multiple technologies and disciplines, with a focus on delivering quality solutions for the business in a timely fashion. This role suits people experienced in complex data domains.

Required Skills:
  • Experience of Agile and Scrum methodologies
  • Core Java
  • Unix shell scripting
  • SQL and relational databases such as DB2
  • Integration technologies - MQ/XML/SOAP/JSON/Protocol Buffers/Spring
  • Enterprise architecture patterns, GoF design patterns
  • Build & Agile tooling - Ant, Gradle/Maven, Sonar, Jenkins/Hudson, Git/Perforce
  • Sound understanding of Object Oriented Analysis, Design and Programming
  • Strong communication and stakeholder management skills
  • Scala/Spark or big data will be an added advantage
  • Good experience with databases
  • Excellent communication and problem-solving skills

Desired Skills:
  • Experience in banking and regulatory reporting (SFTR, MAS/ASIC etc.)
  • Knowledge of OTC, listed and cash products
  • Domain-driven design and microservices
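The cash flow and valuations area described above centers on discounting future cash flows to present value. Here is a toy sketch of that textbook calculation with a flat annual rate; the method names and parallel-array shape are illustrative assumptions, not the bank's actual engine.

```java
// Textbook present-value calculation for a set of future cash flows:
// PV = sum over flows of amount / (1 + r)^t, with a flat annual rate r.
public class CashFlowValuation {
    // amounts[i] is the cash amount due yearsFromNow[i] years in the future.
    public static double presentValue(double[] amounts, double[] yearsFromNow,
                                      double rate) {
        double pv = 0.0;
        for (int i = 0; i < amounts.length; i++) {
            pv += amounts[i] / Math.pow(1.0 + rate, yearsFromNow[i]);
        }
        return pv;
    }
}
```

A production engine would of course use full discount curves and day-count conventions rather than a single flat rate; this only illustrates the core idea.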
Posted by Shiv Parekh
Mumbai
2 - 10 yrs
₹10L - ₹30L / yr
React.js
Python
Java
Data Structures
Algorithms
+5 more
  • Collaborate with cofounders to guide the strategic direction of the product and company
  • Responsible for the entire product lifecycle from concept to deployment
  • Build out user experience, design, development and QA capabilities through a combination of hiring and outsourcing to vendors
  • Establish the initial architecture for the application, which will include database design and cloud infrastructure
  • Estimate development release cycles and manage the release process
  • Implement Scrum-based development cycles
Posted by Neha Mayekar
Mumbai
5 - 14 yrs
₹8L - ₹18L / yr
HDFS
Hbase
Spark
Flume
hive
+2 more
US-based multinational company. Hands-on Hadoop experience required.