
36+ Scala Jobs in Pune | Scala Job openings in Pune

Apply to 36+ Scala Jobs in Pune on CutShort.io. Explore the latest Scala Job opportunities across top companies like Google, Amazon & Adobe.

Sahaj AI Software

1 video
6 recruiters
Posted by Soumya Tripathy
Pune
11 - 17 yrs
Best in industry
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript
+10 more

About the role

As a full-stack engineer, you’ll feel at home if you are hands-on, grounded, opinionated and passionate about building things using technology. Our tech stack ranges widely, with language ecosystems like TypeScript, Java, Scala, Golang, Kotlin, Elixir, Python, .NET, Node.js and even Rust.

This role is ideal for those looking to have a large impact and a huge scope for growth while still being hands-on with technology. We aim to allow growth without becoming “post-technical”. We are extremely selective with our consultants and are able to run our teams with fewer levels of management. You won’t find a BA or iteration manager here! We work in small pizza teams of 2-5 people where a well-founded argument holds more weight than years of experience. You will have the opportunity to work with clients across domains like retail, banking, publishing, education, ad tech and more, where you will take ownership of developing software solutions that are purpose-built to solve our clients’ unique business and technical needs.

Responsibilities

  • Produce high-quality code that allows us to put solutions into production.
  • Utilize DevOps tools and practices to build and deploy software.
  • Collaborate with Data Scientists and Engineers to deliver production-quality AI and Machine Learning systems.
  • Build frameworks and supporting tooling for data ingestion from a complex variety of sources.
  • Work in short sprints to deliver working software with clear deliverables and client-led deadlines.
  • Willingness to be a polyglot developer and learn multiple technologies.

Skills you’ll need

  • A maker’s mindset. To be resourceful and have the ability to do things that have no instructions.
  • Extensive experience (at least 10 years) as a Software Engineer.
  • Deep understanding of programming fundamentals and expertise with at least one programming language (functional or object-oriented).
  • A nuanced and rich understanding of code quality, maintainability and practices like Test Driven Development.
  • Experience with one or more source control and build toolchains.
  • Working knowledge of CI/CD will be an added advantage.
  • Understanding of web APIs, contracts and communication protocols.
  • Understanding of Cloud platforms, infra-automation/DevOps, IaC/GitOps/Containers, design and development of large data platforms.

What will you experience in terms of culture at Sahaj?

  • A culture of trust, respect and transparency
  • Opportunity to collaborate with some of the finest minds in the industry
  • Work across multiple domains

What are the benefits of being at Sahaj?

  • Unlimited leaves
  • Life Insurance & Private Health insurance paid by Sahaj
  • Stock options
  • No hierarchy
  • Open Salaries

Publicis Sapient

10 recruiters
Posted by Mohit Singh
Bengaluru (Bangalore), Pune, Hyderabad, Gurugram, Noida
5 - 11 yrs
₹20L - ₹36L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the necessary health of the overall solution.

Job Summary:

As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the necessary health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala or Python; experience in data ingestion, integration and wrangling, computation and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms is also required.
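
For context, the snippet below is a minimal, illustrative sketch (not taken from the job description) of the kind of batch ingestion and wrangling pipeline described above, written in Scala with Apache Spark. The file paths, column names and schema are assumptions made purely for illustration.

```scala
import org.apache.spark.sql.{SparkSession, functions => F}

object IngestOrders {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("orders-ingestion")
      .getOrCreate()

    // Ingest a raw CSV drop (hypothetical location and layout)
    val raw = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("s3a://example-bucket/raw/orders/")

    // Basic wrangling: de-duplicate, normalise types, derive a partition column
    val cleaned = raw
      .dropDuplicates("order_id")
      .withColumn("amount", F.col("amount").cast("double"))
      .withColumn("order_date", F.to_date(F.col("order_ts")))

    // Write partitioned Parquet for downstream analytics pipelines
    cleaned.write
      .mode("overwrite")
      .partitionBy("order_date")
      .parquet("s3a://example-bucket/curated/orders/")

    spark.stop()
  }
}
```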


Role & Responsibilities:

Your role is focused on Design, Development and delivery of solutions involving:

• Data Integration, Processing & Governance

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Implement scalable architectural models for data processing and storage

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode

• Build functionality for data analytics, search and aggregation

Experience Guidelines:

Mandatory Experience and Competencies:

1. Overall 5+ years of IT experience with 3+ years in data-related technologies

2. Minimum 2.5 years of experience in Big Data technologies and working exposure to at least one cloud platform and its related data services (AWS / Azure / GCP)

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines

4. Strong experience in at least one of the programming languages Java, Scala or Python (Java preferred)

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.

6. Well-versed in and working knowledge of data platform-related services on at least one cloud platform, IAM and data security


Preferred Experience and Knowledge (Good to Have):

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality

6. Cloud data specialty and other related Big Data technology certifications


Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes



Publicis Sapient

10 recruiters
Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As a Senior Associate in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the necessary health of the overall solution.

Job Summary:

As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the necessary health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala or Python; experience in data ingestion, integration and wrangling, computation and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms is preferable.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on Design, Development and delivery of solutions involving:

• Data Ingestion, Integration and Transformation

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time

• Build functionality for data analytics, search and aggregation


Experience Guidelines:

Mandatory Experience and Competencies:

1. Overall 3.5+ years of IT experience with 1.5+ years in data-related technologies

2. Minimum 1.5 years of experience in Big Data technologies

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.

4. Strong experience in at least one of the programming languages Java, Scala or Python (Java preferred)

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.


Preferred Experience and Knowledge (Good to Have):

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres) with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality

6. Working knowledge of data platform-related services on at least one cloud platform, IAM and data security

7. Cloud data specialty and other related Big Data technology certifications


Job Title: Senior Associate L1 – Data Engineering

Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes


NutaNXT Technologies

1 recruiter
Posted by Jidnyasa S
Pune
6 - 9 yrs
₹15L - ₹28L / yr
Spark
Scala
Databricks
NoSQL Databases

DATA ENGINEERING CONSULTANT


About NutaNXT: NutaNXT is a next-gen Software Product Engineering services provider building ground-breaking products using AI/ML, Data Analytics, IoT, Cloud & new emerging technologies disrupting the global markets. Our mission is to help clients leverage our specialized Digital Product Engineering capabilities in Data Engineering, AI Automations, and full-stack software solutions and services to build best-in-class products and stay ahead of the curve. You will get a chance to work on multiple projects critical to NutaNXT's needs, with opportunities to learn, develop new skills, and switch teams and projects as you and our fast-paced business grow and evolve.

Location: Pune
Experience: 6 to 8 years


Job Description: NutaNXT is looking for a Data Engineering Consultant to support the planning and implementation of data design services, provide sizing and configuration assistance, and perform needs assessments, as well as deliver architectures for transformations and modernizations of enterprise data solutions using Azure cloud data technologies. As a Data Engineering Consultant, you will collect, aggregate, store, and reconcile data in support of the customer's business decisions. You will design and build data pipelines, data streams, data service APIs, data generators and other end-user information portals and insight tools.


Mandatory Skills:


  1. Demonstrable experience in enterprise-level data platforms involving implementation of end-to-end data pipelines with Python or Scala, with hands-on experience on at least one of the leading public cloud data platforms (ideally Azure)
  2. Experience with different databases (column-oriented databases, NoSQL databases, RDBMS)
  3. Experience in architecting data pipelines and solutions for both streaming and batch integrations using tools/frameworks like Azure Databricks, Azure Data Factory, Spark, Spark Streaming, etc. (a brief illustrative sketch follows this list)
  4. Understanding of data modeling, warehouse design and fact/dimension concepts; good communication skills
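
As a rough illustration of point 3 above (not part of the client requirement itself), the sketch below applies one shared transformation in both a batch read and a Spark Structured Streaming read, in Scala. The paths, Kafka broker, topic and schema are assumptions, and the streaming read assumes the Spark Kafka connector is available on the cluster.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession, functions => F}

object BatchAndStreamingExample {
  // One transformation shared by the batch and streaming integrations
  def enrich(events: DataFrame): DataFrame =
    events
      .withColumn("event_date", F.to_date(F.col("event_ts")))
      .filter(F.col("event_type").isNotNull)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("batch-and-streaming").getOrCreate()

    // Batch integration: read a static Parquet drop (illustrative path)
    val batch = spark.read.parquet("/mnt/raw/events/2024-01-01/")
    enrich(batch).write.mode("append").parquet("/mnt/curated/events/")

    // Streaming integration: read the same shape of data from Kafka (illustrative broker/topic)
    val stream = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS json")
      .select(F.from_json(F.col("json"), batch.schema).as("e"))
      .select("e.*")

    enrich(stream).writeStream
      .format("parquet")
      .option("path", "/mnt/curated/events_stream/")
      .option("checkpointLocation", "/mnt/checkpoints/events_stream/")
      .start()
      .awaitTermination()
  }
}
```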


Good To Have:


• Certifications for any of the cloud services (ideally Azure)
• Experience working with code repositories and continuous integration
• Understanding of development and project methodologies


Why Join Us?


We offer innovative work in the AI & Data Engineering space, a unique and diverse workplace environment, and continuous learning and development opportunities. These are just some of the reasons we're consistently recognized as one of the best companies to work for, and why our people choose to grow their careers at NutaNXT. We also offer a highly flexible, self-driven, remote work culture that fosters innovation, creativity and work-life balance, along with industry-leading compensation, which we believe helps us consistently deliver to our clients and grow in the highly competitive, fast-evolving Digital Engineering space, with a strong focus on building advanced software products for clients in the US, Europe and APAC regions.


One of the world's leading multinational investment banks

Agency job
via HiyaMee by Lithin Raj
Pune
5 - 9 yrs
₹5L - ₹15L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more
This role is for a developer with strong core application or system programming skills in Scala and Java, and good exposure to concepts and/or technology across the broader spectrum. Enterprise Risk Technology covers a variety of existing systems and green-field projects.

Full-stack Hadoop development experience with Scala development.
Full-stack Java development experience covering Core Java (including JDK 1.8) and a good understanding of design patterns.
Requirements:-
• Strong hands-on development in Java technologies.
• Strong hands-on development in Hadoop technologies like Spark, Scala and experience on Avro.
• Participation in product feature design and documentation
• Requirement break-up, ownership and implementation.
• Product BAU deliveries and Level 3 production defects fixes.
Qualifications & Experience
• Degree holder in a numerate subject
• Hands on Experience on Hadoop, Spark, Scala, Impala, Avro and messaging like Kafka
• Experience across a core compiled language – Java
• Proficiency in Java-related frameworks like Spring, Hibernate, JPA
• Hands-on experience in JDK 1.8 and a strong skillset covering Collections and Multithreading, with experience working on distributed applications.
• Strong hands-on development track record with end-to-end development cycle involvement
• Good exposure to computational concepts
• Good communication and interpersonal skills
• Working knowledge of risk and derivatives pricing (optional)
• Proficiency in SQL (PL/SQL), data modelling.
• Understanding of Hadoop architecture and the Scala programming language is good to have.

HCL Technologies

3 recruiters
Agency job
via Saiva System by Sunny Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Hyderabad, Chennai, Pune, Mumbai, Kolkata
5 - 10 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more
Experience: 5+ years
Skills: Spark and Scala along with Azure
Location: Pan India

Looking for someone with Big Data experience along with Azure.

Consulting & implementation services in the area of Oil & Gas, Mining and Manufacturing Industry

Agency job
via Jobdost by Sathish Kumar
Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹18L - ₹25L / yr
AWS Lambda
AWS Simple Notification Service (SNS)
AWS Simple Queuing Service (SQS)
Python
PySpark
+9 more
  1. Data Engineer

Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet 
  • Proficient in AWS S3 and data lake 
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices.
  • Scripting languages - Python & PySpark

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS 
  • Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies (a brief illustrative sketch follows this list)
  • Data processing/transformation using various technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it using the language supported by the base data platform
  • Develop automated data quality checks to make sure the right data enters the platform and to verify the results of the calculations
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
  • Define process improvement opportunities to optimize data collection, insights and displays.
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 
  • Identify and interpret trends and patterns from complex data sets 
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 
  • Key participant in regular Scrum ceremonies with the agile teams  
  • Proficient at developing queries, writing reports and presenting findings 
  • Mentor junior members and bring best industry practices 
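
Purely for illustration (not part of the responsibilities above), here is a minimal Scala/Spark sketch of ingesting from two of the source types mentioned, an RDBMS table over JDBC and flat files, into a Parquet-based data lake layout. The connection details, paths and table names are invented for the example.

```scala
import org.apache.spark.sql.SparkSession

object MultiSourceIngestion {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("multi-source-ingestion").getOrCreate()

    // RDBMS source read over JDBC (hypothetical connection details)
    val customers = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://db-host:5432/sales")
      .option("dbtable", "public.customers")
      .option("user", "reader")
      .option("password", sys.env.getOrElse("DB_PASSWORD", ""))
      .load()

    // Flat-file source (hypothetical S3 prefix)
    val transactions = spark.read
      .option("header", "true")
      .csv("s3a://example-lake/landing/transactions/")

    // Land both datasets as Parquet in the curated zone of the data lake
    customers.write.mode("overwrite").parquet("s3a://example-lake/curated/customers/")
    transactions.write.mode("overwrite").parquet("s3a://example-lake/curated/transactions/")

    spark.stop()
  }
}
```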

QUALIFICATIONS 

  • 5-7+ years’ experience as data engineer in consumer finance or equivalent industry (consumer loans, collections, servicing, optional product, and insurance sales) 
  • Strong background in math, statistics, computer science, data science or related discipline
  • Advanced knowledge of at least one language: Java, Scala, Python, or C#
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  
  • Proficient with:
      • Data mining/programming tools (e.g. SAS, SQL, R, Python)
      • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
      • Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools. 
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 
  • Good written and oral communication skills and ability to present results to non-technical audiences 
  • Knowledge of business intelligence and analytical tools, technologies and techniques.

  

Familiarity and experience in the following is a plus:  

  • AWS certification
  • Spark Streaming 
  • Kafka Streaming / Kafka Connect 
  • ELK Stack 
  • Cassandra / MongoDB 
  • CI/CD: Jenkins, GitLab, Jira, Confluence and other related tools

GradMener Technology Pvt. Ltd.
Pune, Chennai
5 - 9 yrs
₹15L - ₹20L / yr
Scala
PySpark
Spark
SQL Azure
Hadoop
+4 more
  • 5+ years of experience in a Data Engineering role on cloud environment
  • Must have good experience in Scala/PySpark (preferably in a Databricks environment)
  • Extensive experience with Transact-SQL.
  • Experience in Databricks/Spark.
  • Strong experience in data warehouse projects
  • Expertise in database development projects with ETL processes.
  • Manage and maintain data engineering pipelines
  • Develop batch processing, streaming and integration solutions
  • Experienced in building and operationalizing large-scale enterprise data solutions and applications
  • Using one or more of Azure data and analytics services in combination with custom solutions
  • Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
  • In-depth understanding of data management (e.g. permissions, security, and monitoring).
  • Cloud repositories, e.g. Azure GitHub, Git
  • Experience in an agile environment (Prefer Azure DevOps).

Good to have

  • Manage source data access security
  • Automate Azure Data Factory pipelines
  • Continuous Integration/Continuous deployment (CICD) pipelines, Source Repositories
  • Experience in implementing and maintaining CICD pipelines
  • Power BI understanding, Delta Lakehouse architecture
  • Knowledge of software development best practices.
  • Excellent analytical and organization skills.
  • Effective working in a team as well as working independently.
  • Strong written and verbal communication skills.
  • Expertise in database development projects and ETL processes.

EnterpriseMinds

2 recruiters
Posted by phani kalyan
Pune
9 - 14 yrs
₹20L - ₹40L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more
Job Id: SG0601

Hi,

Enterprise Minds is looking for a Data Architect for the Pune location.

Required Skills:
Python, PySpark, Hadoop, Java, Scala

Tier 1 MNC

Agency job
Chennai, Pune, Bengaluru (Bangalore), Noida, Gurugram, Kochi (Cochin), Coimbatore, Hyderabad, Mumbai, Navi Mumbai
3 - 12 yrs
₹3L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more
Greetings,
We are hiring software developers for a Tier 1 MNC, with good knowledge of Spark, Hadoop and Scala.

This company provides on-demand cloud computing platforms.

Agency job
via New Era India by Niharica Singh
Remote, Pune, Mumbai, Bengaluru (Bangalore), Gurugram, Hyderabad
15 - 25 yrs
₹35L - ₹55L / yr
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Windows Azure
Architecture
Python
+5 more
  • 15+ years of Hands-on technical application architecture experience and Application build/ modernization experience
  • 15+ years of experience as a technical specialist in Customer-facing roles.
  • Ability to travel to client locations as needed (25-50%)
  • Extensive experience architecting, designing and programming applications in an AWS Cloud environment
  • Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
  • Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
  • Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
  • Agile software development expert
  • Experience with continuous integration tools (e.g. Jenkins)
  • Hands-on familiarity with CloudFormation
  • Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
  • Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
  • Strong practical application development experience on Linux and Windows-based systems
  • Extracurricular software development passion (e.g. active open-source contributor)

Persistent Systems Ltd

Agency job
via Milestone Hr Consultancy by Haina khan
Pune, Bengaluru (Bangalore), Hyderabad
4 - 9 yrs
₹8L - ₹27L / yr
Python
PySpark
Amazon Web Services (AWS)
Spark
Scala
Greetings,

We have an urgent requirement for Data Engineer / Sr. Data Engineer roles at a reputed MNC.

Experience: 4-9 yrs

Location: Pune/Bangalore/Hyderabad

Skills: We need candidates with either Python + AWS, PySpark + AWS, or Spark + Scala.

Persistent Systems

1 video
1 recruiter
Agency job
via Milestone Hr Consultancy by Haina khan
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur
4 - 9 yrs
₹4L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more
Greetings,

We have an urgent requirement for Big Data Developer profiles at a reputed MNC.

Location: Pune/Bangalore/Hyderabad/Nagpur
Experience: 4-9 yrs

Skills: PySpark, AWS
or Spark, Scala, AWS
or Python, AWS

DataMetica

1 video
7 recruiters
Posted by Shivani Mahale
Pune
15 - 20 yrs
₹25L - ₹50L / yr
Engineering Management
Engineering Manager
Engineering Director
Engineering Head
VP of Engineering
+9 more

As a Director of Engineering, your role & responsibilities will include the following.

  • Define the product roadmap and delivery planning. 
  • Provides technical leadership in design, delivery, and support of product software and platforms. 
  • Participate in driving technical architectural decisions for the product.  
  • Prioritization of the features and deliverable artifacts 
  • Augmentation of the product development staff.  
  • Mentor managers to implement best practices to motivate and organize their  teams. 
  • Prepare schedules, report status as well as make hiring decisions. 
  • Demonstrate a proven ability to evaluate and improve software development best practices.
  • Provide DevOps and other processes to assure consistency, quality and  timeliness. 
  • Participate in interviewing as well as hiring final decisions. 
  • Guide and provide input to all strategic as well as technical planning for entire  products. 
  • Monitor and provide input for evaluation and prioritize change requests.
  • Create and monitor the set of policies that establish standard development languages, tools, and methodology; documentation practices; and examination procedures for developed systems to ensure alignment with overall architecture.
  • Participate in project scope, schedule, and cost reviews. 
  • Understand and socialize product capabilities and limitations. 
  • Identify and implement ways to improve and promote quality and demonstrate  accuracy and thoroughness. 
  • Establish working relationships with external technology vendors. 
  • Integrate customer requirements through the engineering effort for championing  next generation products. 
  • Quickly gain an understanding of the company's technology and markets,  establish yourself as a credible leader. 
  • Release scheduling. 
  • Keeps abreast of new technologies and has demonstrated knowledge and  experience in various technologies. 
  • Manage 3rd party consulting partners/vendors implementing products. 
  • Prepare and submit weekly project status reports; prepare monthly reports  outlining team assignments and/or changes, project status changes, and  forecasts project timelines.
  • Provide leadership to individuals or team(s) through coaching, feedback,  development goals, and performance management. 
  • Prioritize employee career development to grow the internal pipeline of leadership  talent. 
  • Prioritize, assign, and manage department activities and projects in accordance  with the department's goals and objectives. Adjust hours of work, priorities, and  staff assignments to ensure efficient operation, based on workload. 

 

Qualification & Experience  

  • Master’s or bachelor’s degree in Computer Science, Business Information  Systems or related field or equivalent work experience required. 
  • Relevant certifications also preferred among other indications of someone who  values continuing education. 
  • 15+ years’ experience "living" with various operating systems, development tools  and development methodologies including Java, data structures, Scala, Python,  NodeJS 
  • 8+ years of individual contributor software development experience.
  • 6+ years management experience in a fast-growing product software  environment with proven ability to lead and engage development, QA and  implementation teams working on multiple projects. 
  • Idea generation and creativity in this position are a must, as are the ability to  work with deadlines, manage and complete projects on time and within budget. 
  • Proven ability to establish and drive processes and procedures with quantifiable  metrics to measure the success and effectiveness of the development  organization. 
  • Proven history of delivering on deadlines/releases without compromising quality. 
  • Mastery of engineering concepts and core technologies: development models,  programming languages, databases, testing, and documentation. 
  • Development experience with compilers, web Services, database engines and  related technologies. 
  • Experience with Agile software development and SCRUM methodologies. 
  • Proven track record of delivering high quality software products. 
  • A solid engineering foundation indicated by a demonstrated understanding of  
  • product design, life cycle, software development practices, and support services.  Understanding of standard engineering processes and software development  methodologies. 
  • Experience coordinating the work and competences of software staff within  functional project groups. 
  • Ability to work cross functionally and as a team with other executive committee  members. 
  • Strong verbal and written communication skills. 
  • Communicate effectively with different business units about technology and  processes using lay terms and descriptions.  
  • Experience Preferred:
  • Experience building horizontally scalable solutions leveraging containers,  microservices, Big Data technologies among other related technologies. 
  • Experience working with graphical user experience and user interface design. 
  • Experience working with object-oriented software development, web services,  web development or other similar technical products. 
  • Experience with database engines, languages, and compilers  
  • Experience with user acceptance testing, regression testing and integration  testing. 
  • Experience working on open-source software projects for Apache and other great  open-source software organizations. 
  • Demonstrable experience training and leading teams as a great people leader.

Amazon India

1 video
58 recruiters
Posted by Akhil Ravipalli
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Chennai, Pune
2 - 9 yrs
₹15L - ₹60L / yr
Systems design
Data Structures
Algorithms
Java
Python
+6 more

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

 

Top Skills

 

  • You write high quality, maintainable, and robust code, often in Java or C++/C/Python/ROR/C#
  • You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
  • You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

 

  • You solve problems at their root, stepping back to understand the broader context.
  • You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
  • You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
  • You recognize and use design patterns to solve business problems.
  • You understand how operating systems work, perform and scale.
  • You continually align your work with Amazon’s business objectives and seek to deliver business value.
  • You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
  • You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
  • You communicate clearly with your team and with other groups and listen effectively.

 

Skills & Experience

 

  • Bachelors or Masters in Computer Science or relevant technical field.
  • Experience in software development and full product life-cycle.
  • Excellent programming skills in any object oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.
  • Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
  • Proficiency in SQL and data modeling.

Amazon India

1 video
58 recruiters
Posted by Srilalitha K
Hyderabad, Bengaluru (Bangalore), Delhi, Gurugram, Pune, Chennai
3 - 9 yrs
₹2L - ₹15L / yr
C
C++
C#
Python
.NET
+14 more

Software Development Engineer – SDE 2

 

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

 

 Top Skills

You write high quality, maintainable, and robust code, often in Java or C++ or C#

You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.

You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

You solve problems at their root, stepping back to understand the broader context.

You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.

You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.

You recognize and use design patterns to solve business problems.

You understand how operating systems work, perform and scale.

You continually align your work with Amazon’s business objectives and seek to deliver business value.

You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.

You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.

You communicate clearly with your team and with other groups and listen effectively.

 

Skills & Experience

Bachelors or Masters in Computer Science or relevant technical field.

Experience in software development and full product life-cycle.

Excellent programming skills in any object-oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.

Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.

Proficiency in SQL and data modeling.


Amazon India

1 video
58 recruiters
Posted by Archana J
Bengaluru (Bangalore), Hyderabad, Delhi, Pune, Chennai
2 - 9 yrs
₹10L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+4 more
Hi,

Please find the JD below and reply with your updated resume if you are interested.

Software Development Engineer
Bengaluru / Hyderabad / Chennai / Delhi
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

Top Skills

• You write high quality, maintainable, and robust code, often in Java or C++.
• You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
• You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities

• You solve problems at their root, stepping back to understand the broader context.
• You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
• You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
• You recognize and use design patterns to solve business problems.
• You understand how operating systems work, perform and scale.
• You continually align your work with Amazon’s business objectives and seek to deliver business value.
• You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
• You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
• You communicate clearly with your team and with other groups and listen effectively.

Skills & Experience

• Bachelors or Masters in Computer Science or relevant technical field.
• Experience in software development and full product life-cycle.
• Excellent programming skills in any object oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.
• Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
• Proficiency in SQL and data modeling.



About Amazon.com

“Many of the problems we face have no textbook solution, and so we-happily-invent new ones.” – Jeff Bezos

Amazon.com – a place where builders can build. We hire the world's brightest minds and offer them an environment in which they can invent and innovate to improve the experience for our customers. A Fortune 100 company based in Seattle, Washington, Amazon is the global leader in e-commerce. Amazon offers everything from books and electronics to apparel and diamond jewelry. We operate sites in Australia, Brazil, Canada, China, France, Germany, India, Italy, Japan, Mexico, Netherlands, Spain, United Kingdom and United States, and maintain dozens of fulfillment centers around the world which encompass more than 26 million square feet.

Technological innovation drives the growth of Amazon, offering our customers more selection, convenient shopping, and low prices. Amazon Web Services provides developers and small to large businesses access to the horizontally scalable state of the art cloud infrastructure like S3, EC2, AMI, CloudFront and SimpleDB, that powers Amazon.com. Developers can build any type of business on Amazon Web Services and scale their application with growing business needs.

We want you to help share and shape our mission to be Earth's most customer-centric company. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of invention that is part of our DNA. We do this every day by inventing elegant and simple solutions to complex technical and business problems. We're making history and the good news is that we've only just begun.


About Amazon India

Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.

We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.



We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India.

Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.

Thanks and Regards,
Archana J
Recruiter (Tech) | Consumer TA

Amazon India

1 video
58 recruiters
Posted by Nithya Nagarathinam
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Gurugram, India
3 - 9 yrs
₹1L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+6 more

Role- Software Development Engineer-2

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

Top Skills

You write high quality, maintainable, and robust code, often in Java or C++ or C#

You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.

You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

You solve problems at their root, stepping back to understand the broader context.

You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.

You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.

You recognize and use design patterns to solve business problems.

You understand how operating systems work, perform and scale.

You continually align your work with Amazon’s business objectives and seek to deliver business value.

You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.

You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.

You communicate clearly with your team and with other groups and listen effectively.

Skills & Experience

Bachelors or Masters in Computer Science or relevant technical field.

Experience in software development and full product life-cycle.

Excellent programming skills in any object-oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.

Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.

Proficiency in SQL and data modeling.


Intergral Add Science

Agency job
via Vipsa Talent Solutions by Prashma S R
Pune
5 - 8 yrs
₹9L - ₹25L / yr
Java
Hadoop
Apache Spark
Scala
Python
+3 more
  • 6+ years of recent hands-on Java development
  • Developing data pipelines in AWS or Google Cloud
  • Java, Python, JavaScript programming languages
  • Great understanding of designing for performance, scalability, and reliability of data-intensive applications
  • Hadoop MapReduce, Spark, Pig
  • Understanding of database fundamentals and advanced SQL knowledge
  • In-depth understanding of object oriented programming concepts and design patterns
  • Ability to communicate clearly to technical and non-technical audiences, verbally and in writing
  • Understanding of full software development life cycle, agile development and continuous integration
  • Experience in Agile methodologies including Scrum and Kanban

A leading software client in Pune

Agency job
via Sapwood Ventures by Sakshi G
Pune
3 - 8 yrs
₹15L - ₹20L / yr
React.js
Redux/Flux
Java
Data Structures
Algorithms
+7 more
Job Title: Software Developer
Technologies: React JS
Experience: 3 to 8 years
Notice Period: Immediate joiner
Job Location: Kalyani Nagar, Pune, MH
Job Summary
Looking for React JS developers who will be responsible for architecting and building applications, as well as coordinating with the teams responsible for other layers of the product infrastructure.
Responsibilities and Duties:-

Responsible for development of new highly-responsive, web-based user interface

Build pixel-perfect, buttery smooth UIs across both mobile platforms.

Diagnose and fix bugs and performance bottlenecks for performance that feels native.

Reach out to the open source community to encourage and help implement mission-critical software fixes—React Native moves fast and often breaks things.

Maintain code and write automated tests to ensure the product is of the highest quality.

Transition existing React web apps to React Native.

Construct visualizations that are able to depict vast amounts of data

Work and collaborate with the rest of the engineering team

Work with product team and graphic designers

Develop a flexible and well-structured front-end architecture, along with the APIs to support it
Required Experience, Skills and Qualifications:-
● Experience with automated testing suites, like (Jest or Mocha)
● Experience with JavaScript, REACT, HTML / CSS, REST API's
● Experience with Git knowledge is a plus
● Hands on Redux
● Familiarity with native build tools, like Xcode, Gradle (Android Studio, IntelliJ)
● Understanding of REST APIs, the document request model, and offline storage

Mobile Programming LLC

1 video
34 recruiters
Posted by Apurva kalsotra
Mohali, Gurugram, Pune, Bengaluru (Bangalore), Hyderabad, Chennai
3 - 8 yrs
₹2L - ₹9L / yr
Data engineering
Data engineer
Spark
Apache Spark
Apache Kafka
+13 more

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture,
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.

Mobile Programming LLC

1 video
34 recruiters
Posted by Apurva kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus 
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud

Amazon India

1 video
58 recruiters
Posted by Neeta Singh Mehta
Bengaluru (Bangalore), Hyderabad, Noida, Gurugram, Mumbai, Pune, NCR (Delhi | Gurgaon | Noida)
3 - 8 yrs
₹6L - ₹32L / yr
Java
Python
Ruby
PHP
C++
+9 more

Greetings from Amazon...!                                       

 

It is our pleasure to personally invite you to apply for a job with Amazon Development Centre India (ADCI). At Amazon we are inclined to hire people with a passion for technology, and you happen to be one of the shortlisted candidates. Our business is committed to recognizing potential and creating teams that embrace innovation.

 

Please find the Eligible criteria and requirements:

 

Job title: SDE – II (Software Development Engineer)
Role Opportunity: Permanent / Full Time / FTE / Regular
Work Location: Hyderabad / Bangalore / Gurgaon

 

Must Have

  • Strong Exposure to Data Structures, Algorithms, Coding, System Design (LLD, HLD, OOAD), Distributed systems, problem solving skills, Architecture (MVC/Microservices), logical thinking.

Amazon (ADCI) - If you are looking for an opportunity to solve deep technical problems and build innovative solutions in a fast-paced environment working with smart, passionate software developers, this might be the role for you. Amazon’s transportation systems get millions of packages to customers worldwide faster and cheaper while providing a world-class customer experience – from checkout to shipment tracking to delivery. Our software systems include services that handle thousands of requests per second, make business decisions impacting billions of dollars a year, integrate with a network of small and large carriers worldwide, manage business rules for millions of unique products, and improve experience for millions of online shoppers. With rapid expansion into new geographies, innovations in supply chain, delivery models and customer experience, an increasingly complex transportation network, an ever-expanding selection of products and a growing number of shipments worldwide, we have an opportunity to build software that scales the business, leads the industry through innovation and delights millions of customers worldwide.

 

As an SDE, you will develop a deep understanding of our business, work closely with development teams and own the architecture and end-to-end delivery of software components.

About Amazon India:

Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.

We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.

We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India. Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.

 

Basic Qualifications:

 

  • 3+ years’ experience building successful production software systems
  • A solid grounding in Computer Science fundamentals (based on a BS or MS in CS or related field)
  • The ability to take convert raw requirements into good design while exploring technical feasibility tradeoffs
  • Expertise in system design (design patterns, LLD, HLD, SOLID principles, OOAD, distributed systems, etc.) and architecture (MVC/microservices).
  • Good understanding of at least some modern programming languages (Java) and open-source technologies (C++, Python, Scala, C#, PHP, Ruby, etc.)
  • Excellence in technical communication
  • Has experience in mentoring other software developers

 

Preferred Qualifications:

 

  • BS/MS in Computer Science or equivalent
  • Experience developing service-oriented architectures and an understanding of design for scalability, performance and reliability
  • Demonstrated ability to mentor other software developers to maintain architectural vision and software quality
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast-paced environment
  • Expertise in delivering high-quality, innovative applications
  • Strong desire to build, sense of ownership, urgency, and drive
  • Strong organizational and problem-solving skills with great attention to detail
  • Ability to triage issues, react well to change, work with teams and multi-task across multiple products and projects
  • Experience building highly scalable, high-availability services
  • The ideal candidate will be a visionary leader, builder and operator.
  • He/she should have experience leading or contributing to multiple simultaneous product development efforts and initiatives.
  • He/she needs to balance technical leadership with strong business judgment to make the right decisions about technology choices.
  • He/she needs to constantly strive for simplicity, and at the same time demonstrate significant creativity, innovation and judgment.
  • Proficiency in at least one modern programming language.
  • Experience with SQL or NoSQL databases.
  • Demonstrated leadership abilities in an engineering environment, driving operational excellence and best practices.
  • Excellent communication, collaboration, reporting, analytical and problem-solving skills.

 

 

Good to Have:

  • Knowledge of professional software engineering practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Experience with enterprise-wide systems
  • Experience influencing software engineering best practices within your team
  • Hands-on expertise in many disparate technologies, typically ranging from front-end user interfaces through to back-end systems and all points in between
  • Strong written and verbal communication skills preferred

 

Key Points to remember:

 

  • Strong knowledge of the Software Development Life Cycle methodology
  • Technical design, development and implementation decisions on the use of technology in area(s) of specialization
  • Write or modify programming code to suit the customer's needs
  • Unit test to ensure the code meets requirements, including integration tests as needed
  • Ability to understand and analyze issues and use judgment to make decisions
  • Strong problem-solving and troubleshooting skills
  • Strong communication skills
  • Responsible for self-development according to a professional development plan

 

 

 

 

 

 

DataMetica

at DataMetica

1 video
7 recruiters
Sumangali Desai
Posted by Sumangali Desai
Pune, Hyderabad
7 - 12 yrs
₹7L - ₹20L / yr
Apache Spark
Big Data
Spark
Scala
Hadoop
+3 more
We at Datametica Solutions Private Limited are looking for a Big Data Spark Lead with a passion for the cloud and knowledge of on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description
Experience : 7+ years
Location : Pune / Hyderabad
Skills :
  • Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings
  • Participate and contribute in solution design and solution architecture for implementing Big Data projects on-premise and on the cloud
  • Technical hands-on experience in the design, coding, development and management of large Hadoop implementations
  • Proficient in SQL, Hive, Pig, Spark SQL, shell scripting, Kafka, Flume and Sqoop on large Big Data and Data Warehousing projects, with a Java, Python or Scala based Hadoop programming background (a short Spark SQL/Hive sketch follows this list)
  • Proficient with various development methodologies like waterfall, agile/scrum and iterative
  • Good interpersonal skills and excellent communication skills for US and UK based clients
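To illustrate the kind of Spark SQL and Hive work this role calls for, here is a minimal Scala sketch of a batch job aggregating a Hive-registered table. It assumes a Hive-enabled Spark cluster; the database, table and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object HiveAggregationSketch {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport lets Spark SQL resolve tables registered in the Hive metastore
    val spark = SparkSession.builder()
      .appName("hive-aggregation-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical warehouse table: one row per order
    val dailyRevenue = spark.table("sales.orders")
      .groupBy(col("order_date"))
      .agg(sum(col("amount")).as("daily_revenue"))

    // Persist the aggregate as a managed table for downstream reporting
    dailyRevenue.write.mode("overwrite").saveAsTable("sales.daily_revenue")

    spark.stop()
  }
}
```

Submitted with spark-submit on the cluster, a job of this shape is a typical building block in the migration work described in this listing.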

About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud, leveraging Automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.


We have our own products!
Eagle – Data Warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

Why join us!
Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over these years are the key factors in achieving our success.

Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy

Check out more about us on our website below!
www.datametica.com
Service based company

Agency job
via Tech - Soul Technologies by Rohini Shinde
Pune
6 - 12 yrs
₹6L - ₹28L / yr
Big Data
Apache Kafka
Data engineering
Cassandra
Java
+1 more

Primary responsibilities:

  • Architect, design and build high-performance search systems for personalization, optimization and targeting
  • Design systems with Solr, Akka, Cassandra and Kafka
  • Algorithmic development with a primary focus on Machine Learning
  • Work with rapid and innovative development methodologies like Kanban, Continuous Integration and daily deployments
  • Participate in design and code reviews and recommend improvements
  • Unit testing with JUnit, performance testing and tuning
  • Coordination with internal and external teams
  • Mentoring junior engineers
  • Participate in product roadmap and prioritization discussions and decisions
  • Evangelize the solution with Professional Services and Customer Success teams

 

Springer Nature

at Springer Nature

2 recruiters
Anagha Malvade
Posted by Anagha Malvade
Pune
4.5 - 10 yrs
₹5L - ₹20L / yr
Scala
Java
Javascript
Akka
Play Framework

We are on a quest to find a Senior Software Developer - Scala with several years of experience who will help us extend and maintain our platform.

You will join a cross-functional team with different nationalities, backgrounds and experience levels. The agile team is co-located, including the Product Managers.

You will collaborate with all team members in order to deliver the best solutions that enhance our platform.

About Springer Nature India Pvt. Ltd:

 Springer Nature opens the doors to discovery for researchers, educators, clinicians and other professionals. Every day, around the globe, our imprints, books, journals, platforms and technology solutions reach millions of people. For over 175 years our brands and imprints have been a trusted source of knowledge to these communities and today, more than ever, we see it as our responsibility to ensure that fundamental knowledge can be found, verified, understood and used by our communities – enabling them to improve outcomes, make progress, and benefit the generations that follow.

Visit: group.springernature.com and follow @SpringerNature

 

If you are still wondering, why should you work with us. Here are 5 reasons why?

  1. Springer Nature is one of the world's largest publishing companies. Nobel laureates publish their research at Springer.
  2. We are truly a digital organization and Springer Nature Pune is at the helm of this digitization.
  3. We not only believe in but actively support a good work-life balance for our employees.
  4. We are investing in building our products using machine learning and NLP.
  5. We work with the latest technologies like AWS and Scala.

About the team:

Backend - Adis – PV is a scientific analysis platform being built for extracting content from scientific articles and providing meaningful insights

Insights are then published in a structured way, so that they can be made accessible to end users, via feeds delivery or through the platform

Backend - Adis – PV will be a production system for all databases under one IT landscape and under one umbrella.

Job Type: Permanent

Job Location: Magarpatta City, Pune - India (Work from home until further notice)

Years of Experience: 6 to 10 years

What we are looking for

Educational Qualification:

B.Sc., BCA, BCS, B.E., B.Tech, M.Tech, MCA and M.Sc.

Skill Matrix:

Primary Language Skills: Java 8, Scala

Framework: Play Framework (a minimal controller sketch follows this skill matrix)

Messaging: Rabbit MQ

Ideologies: TDD / ATDD, Pair Programming

Database: SQL and NoSQL
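
As a rough illustration of the Play Framework skills listed above, here is a minimal Scala controller sketch; the controller name, route and payload are hypothetical, and a real application would also need a matching entry in conf/routes.

```scala
import javax.inject.Inject
import play.api.mvc.{AbstractController, ControllerComponents}

// Hypothetical controller exposing a single read endpoint of the platform
class InsightController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {

  // GET /insights/:id -> returns a placeholder JSON payload for the requested insight
  def show(id: Long) = Action {
    Ok(s"""{"id": $id, "status": "published"}""").as("application/json")
  }
}
```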

Challenges

  • You will help us continuously improve our platform
  • Together we will create best-in-class services that support the needs of our customers
  • Take part in team ceremonies like grooming, planning and retrospectives
  • Develop new features
  • Improve code quality by doing pair programming or code reviews
  • Continuously improve and monitor our product

Key Responsibilities

  • Own and consistently deliver high-quality end-to-end product features, keeping in view technical and business expectations
  • Add meaty features to the product that deliver substantial business value
  • Pioneer clean coding and continuous code refactoring
  • Understand and appreciate the existing design and architecture of the product
  • Understand the pros and cons of the various technology options available
  • Take technical ownership of some of the sub-systems of the product
  • Make changes in the product design to achieve the required business value/mileage
  • Identify and address technical debt
  • Understand the technical vision, road-map and expectations of the product
  • Understand the purview of key deliverables and own a few of these pieces

Day at work

  • Pioneer proofs of concept of new technologies, keeping in view the product road-map and business priorities
  • Self-study and share your learning within the team and across teams
  • Provide required help to other team members
  • Take the lead in various team events and work towards the objectives of these events
  • Make meaningful suggestions to make ceremonies more effective

About You

  • You have several years of experience with software development
  • You have worked successfully with product teams in the past and ideally have some experience mentoring junior developers
  • You like working in a collaborative environment where there is collective ownership of the code
  • You work with Continuous Integration and always strive for Continuous Delivery
  • You like to share and enable others, increasing your whole team's performance
Fast paced Startup

Agency job
via Kavayah People Consulting by Kavita Singh
Pune
3 - 6 yrs
₹15L - ₹22L / yr
Big Data
Data engineering
Hadoop
Spark
Apache Hive
+6 more

Years of Exp: 3-6+ Years
Skills: Scala, Python, Hive, Airflow, Spark

Languages: Java, Python, Shell Scripting

GCP: Bigtable, Dataproc, BigQuery, GCS, Pub/Sub

OR
AWS: Athena, Glue, EMR, S3, Redshift

MongoDB, MySQL, Kafka

Platforms: Cloudera / Hortonworks
AdTech domain experience is a plus.
Job Type - Full Time 

Persistent Systems

at Persistent Systems

1 video
1 recruiter
Agency job
via Milestone Hr Consultancy by Haina khan
Bengaluru (Bangalore), Hyderabad, Pune
9 - 16 yrs
₹7L - ₹32L / yr
Big Data
Scala
Spark
Hadoop
Python
+1 more
Greetings!

We have an urgent requirement for the post of Big Data Architect at a reputed MNC.

Location: Pune/Nagpur, Goa, Hyderabad/Bangalore

Job Requirements:

  • 9+ years of total experience, preferably in the big data space
  • Creating Spark applications using Scala to process data (a minimal batch-job sketch follows this list)
  • Experience in scheduling and troubleshooting/debugging Spark jobs in steps
  • Experience in Spark job performance tuning and optimization
  • Should have experience in processing data using Kafka/Python
  • Should have experience and understanding in configuring Kafka topics to optimize performance
  • Should be proficient in writing SQL queries to process data in a Data Warehouse
  • Hands-on experience working with Linux commands to troubleshoot/debug issues and creating shell scripts to automate tasks
  • Experience with AWS services like EMR
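A minimal sketch of the kind of Spark-on-Scala batch job this requirement describes, assuming it runs on AWS EMR and reads/writes hypothetical S3 prefixes; the dataset and column names are illustrative only.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventBatchSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("event-batch-sketch")
      .getOrCreate()

    // Read raw events from a hypothetical S3 prefix (EMRFS resolves the s3:// scheme on EMR)
    val events = spark.read.parquet("s3://example-bucket/raw/events/")

    // Drop malformed rows and pre-shuffle on the aggregation key,
    // a common first step when tuning wide transformations
    val cleaned = events
      .filter(col("event_type").isNotNull)
      .repartition(col("customer_id"))

    // Write curated output partitioned by date for downstream SQL queries
    cleaned.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-bucket/curated/events/")

    spark.stop()
  }
}
```
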
Finaxar Technology Solutions Private limited
Remote, Pune
4 - 9 yrs
₹5L - ₹25L / yr
Scala
ORM
Amazon Web Services (AWS)
JIRA Agile
Play Framework

Roles and responsibilities

    • Develop well-designed, performant and scalable microservices
    • Write reusable, testable, and efficient code that follows software development best practices
    • Integrate data storage solutions including databases, key-value stores, blob stores, etc.
    • Expose business functionality to frontend/mobile applications and partner systems through secure and scalable APIs
    • Build integrations with third-party applications through APIs to ingest and process data
    • Ensure security and data protection aspects within the applications
    • Contribute to DevOps by building CI/CD pipelines to automate releases
    • Ensure high performance and availability of distributed systems and applications
    • Interact directly with client project team members and operational staff to support live customer deployments and production issues
Requirements
  • 4+ years of experience in developing applications using Scala and related technologies
  • Thorough understanding of multithreading concepts and async execution using the Actor model (a minimal actor sketch follows this list)
  • Thorough understanding of the Play framework, GraphQL and gRPC technologies
  • Experience in using DAL and ORM (Object Relational Mapper) libraries for data access
  • Experience in developing and hosting APIs and integrating with external applications
  • Experience in building data models and repositories using relational and NoSQL databases
  • Knowledge of JIRA, Bitbucket and agile methodologies
  • Good to have knowledge of AWS services like Lambda, DynamoDB, Kinesis and others
  • Understanding of the fundamental design principles behind a scalable application
  • Familiarity with event-driven programming and distributed architectures
  • Strong unit testing and debugging skills
  • Affinity for learning and applying new technologies and solving new problems
  • Effective organizational skills with strong attention to detail
  • Experience working with Docker is a plus
  • Comfortable working in a Unix/Linux environment
  • Strong communication skills, both written and verbal
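To make the Actor-model point above concrete, here is a minimal Scala sketch using classic Akka actors; the actor, message values and system name are hypothetical and not taken from any Finaxar codebase.

```scala
import akka.actor.{Actor, ActorSystem, Props}

// Hypothetical actor that handles invoice notifications asynchronously;
// its mutable counter is touched only from inside the actor, so no locking is needed
class PaymentNotifier extends Actor {
  private var processed = 0

  def receive: Receive = {
    case invoiceId: String =>
      processed += 1
      println(s"Notified for invoice $invoiceId (total processed: $processed)")
  }
}

object PaymentNotifierApp extends App {
  val system = ActorSystem("actor-sketch")
  val notifier = system.actorOf(Props(new PaymentNotifier), "payment-notifier")

  // Messages go to the actor's mailbox and are processed one at a time, off the caller's thread
  notifier ! "INV-1001"
  notifier ! "INV-1002"

  system.terminate()
}
```
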
DataMetica

at DataMetica

1 video
7 recruiters
Nikita Aher
Posted by Nikita Aher
Pune
1.5 - 2.5 yrs
₹1L - ₹10L / yr
Java
Scala
Python
Maven
Oracle
+2 more
Mandatory Skills:

As a polyglot developer, ideally you should have:
  • 1.5+ years of development experience using any technology such as Java, Scala, Python or similar exciting technologies
  • Hands-on experience in coding and implementing complex, custom-built applications
  • Working knowledge of a build tool like Maven/sbt and of code versioning systems like Git/Bitbucket/CVS/SVN (see the sbt sketch after this list)
  • Familiarity with a few databases, like MySQL, Oracle, PostgreSQL, SQL Server, NoSQL, etc.
  • Great OO skills, including strong design patterns knowledge
  • Good communication and the ability to work in a consulting environment are essential
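For the build-tool point above, a minimal sbt definition might look like the sketch below; the project name, Scala version and dependency versions are assumptions for illustration only.

```scala
// build.sbt - a minimal sbt build for a small Scala project
ThisBuild / scalaVersion := "2.13.12"

lazy val root = (project in file("."))
  .settings(
    name := "sample-service",
    // Test framework pulled in only for the Test configuration
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.17" % Test
  )
```

Running `sbt test` against such a build compiles the project and runs any ScalaTest suites under src/test/scala.
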
Good to Have:
  • Think through hard problems in a consultancy environment, and work with amazing people to make the solutions a reality
  • Work in a dynamic, collaborative, non-hierarchical environment where your talent is valued over your job title or years of experience
  • Build custom software using the latest technologies and tools
  • Craft your own career path
You'll be responsible for:
  • Providing solutions to real problems in the Big Data world
  • R&D on the latest tools, techniques and cloud services
  • Automating manual, time-consuming tasks
  • Hands-on coding, usually in a pair-programming environment
  • Working in highly collaborative teams and building quality code
  • Working in lots of different domains and client environments
  • Understanding the business domain deeply
What we do:
We are a team of technology-agnostic, passionate people who aim to provide solutions to real-world Big Data problems.
We are building solutions that will help our customers automatically migrate their RDBMS systems to the latest Big Data platforms and tools such as Spark, Apex, Flink, etc. For more information do visit our products webpage.
Maveric Systems

at Maveric Systems

3 recruiters
Rashmi Poovaiah
Posted by Rashmi Poovaiah
Bengaluru (Bangalore), Chennai, Pune
4 - 10 yrs
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be part of building an advanced analytical platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing and challenging work environment, and you will play an important role in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience and 2 years of Data Warehousing domain knowledge
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala etc. (a minimal Kafka streaming sketch follows this list)
  • Excellent knowledge of SQL and Linux shell scripting
  • Bachelor's/Master's/Engineering degree from a well-reputed university
  • Strong communication, interpersonal, learning and organizing skills, matched with the ability to manage stress, time and people effectively
  • Proven experience in coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
  • Ability to manage a diverse and challenging stakeholder community
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams
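The sketch below shows a minimal Spark Structured Streaming job in Scala reading from Kafka, the combination named in the requirement above; it assumes the spark-sql-kafka connector is on the classpath, and the broker address and topic name are hypothetical.

```scala
import org.apache.spark.sql.SparkSession

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("kafka-stream-sketch")
      .getOrCreate()

    // Subscribe to a hypothetical topic; each Kafka record arrives as binary key/value columns
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker-1:9092")
      .option("subscribe", "transactions")
      .load()
      .selectExpr("CAST(value AS STRING) AS payload")

    // Echo the raw payloads to the console; a real job would parse, join and aggregate them
    val query = events.writeStream
      .format("console")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```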

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of SCRUM discussions and take requirements
  • Adhere to the SCRUM timeline and deliver accordingly
  • Participate in a team environment for design, development and implementation
  • Should take up L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should create healthy harmony within the team
  • Owns interaction points with members of the core team (e.g. BA team, testing and business teams) and any other relevant stakeholders
Leap Info Systems

at Leap Info Systems

3 recruiters
Sudhir Patil
Posted by Sudhir Patil
Pune
0 - 2 yrs
₹2L - ₹4L / yr
Java
J2EE
Data Structures
Algorithms
Scala
+4 more

About the Company

 

Leap Info Systems Pvt. Ltd. - A software product company with products and solutions in convergent lighting controls and automation. Recently, Leap acquired elitedali – the world's first Niagara-based lighting controls and automation solution – from one of the leading US organizations.

 

We are a passionate team on a mission to develop innovative controls and automation products. We are expanding our product development team and are in search of like-minded highly passionate team members who would like to contribute to the leading lighting controls and automation product. We have customers in India, Europe, USA, and Australia.

 

Eligibility

 

Any suitable graduate (0 to 2 years of experience) who meets the prescribed qualities and job responsibilities. Preferably, but not limited to, engineering graduates in Computer/IT/Instrumentation/EnTC etc.
Job Responsibilities

 

Be a part of the product development team, maintaining as well as developing new products based on Java and the Niagara software framework.

 

Qualities

 

  • Self-Learner
  • Analytical skills
  • Able to work with Cross-functional team
  • Problem-solving approach
  • Good Team Player with Positive vibes

 

What you must have,

  • Hands-on programming at the academic or hobby level
  • Good knowledge of Java/J2EE related technologies
  • Good knowledge of solution-based approaches
  • Good communication skills - phone, email and in-person

 

 

What you can expect from LEAP,

  • A positive, conducive working environment to grow in with cross-functional team members
  • Flexible working hours with defined responsibilities
  • Opportunity to work with a leading open-standard automation framework like Niagara Software and lighting controls technologies like DALI, wireless etc.
  • Be an active part of an emerging global product company
International design & Engineering solutions pvt ltd
Pune
2 - 6 yrs
₹4L - ₹12L / yr
C++
C#
Algorithms
Java
Data Structures
+3 more
Responsibilities:
  • Work with developers to design algorithms and flowcharts
  • Prepare GUI dummy screens for proposed software development using Excel VBA (to give an overview of how the software buttons and flow of information should work)
  • Coordinate with the software developer team to explain the criteria
  • Produce clean, efficient code based on specifications
  • Integrate software components and third-party programs
  • Verify and deploy programs and systems
  • Troubleshoot, debug and upgrade existing software
  • Gather and evaluate user feedback
  • Recommend and execute improvements
  • Create technical documentation for reference and reporting
Requirements:
  • Proven experience as a Software Developer, Software Engineer or similar role
  • Familiarity with development methodologies
  • Experience with software design and development in a test-driven environment
  • Knowledge of coding languages (e.g. C#, C++) and frameworks/systems
  • Ability to learn new languages and technologies
  • Excellent communication skills
  • Resourcefulness and troubleshooting aptitude
  • Attention to detail
  • Sound technical knowledge; thorough knowledge of all related codes and section details is desired
  • Thorough knowledge of the design of components of residential/commercial structures is desired
  • Accuracy in following the process and jobs is required
  • Experience interacting with international clients will be preferred
Saama Technologies

at Saama Technologies

6 recruiters
Sandeep Chaudhary
Posted by Sandeep Chaudhary
Pune
2 - 5 yrs
₹1L - ₹18L / yr
Hadoop
Spark
Apache Hive
Apache Flume
Java
+5 more
Description:
  • Deep experience and understanding of Apache Hadoop and surrounding technologies required; experience with Spark, Impala, Hive, Flume, Parquet and MapReduce
  • Strong understanding of development languages including Java, Python, Scala and shell scripting
  • Expertise in Apache Spark 2.x framework principles and usage
  • Should be proficient in developing Spark batch and streaming jobs in Python, Scala or Java
  • Should have proven experience in performance tuning of Spark applications, both from the application code and the configuration perspective (a minimal tuning sketch follows this list)
  • Should be proficient in Kafka and its integration with Spark
  • Should be proficient in Spark SQL and data warehousing techniques using Hive
  • Should be very proficient in Unix shell scripting and in operating on Linux
  • Should have knowledge of any cloud-based infrastructure
  • Good experience in tuning Spark applications and performance improvements
  • Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities
  • Experience with best practices of software development: version control systems, automated builds, etc.
  • Experienced in and able to lead the following phases of the Software Development Life Cycle on any project: feasibility planning, analysis, development, integration, test and implementation
  • Capable of working within a team or as an individual
  • Experience creating technical documentation
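As a small illustration of the two tuning angles mentioned above (configuration and application code), consider the Scala sketch below; the config values and the input path are illustrative assumptions, not recommendations.

```scala
import org.apache.spark.sql.SparkSession

object TuningSketch {
  def main(args: Array[String]): Unit = {
    // Configuration-side knobs: shuffle parallelism and the broadcast-join threshold
    val spark = SparkSession.builder()
      .appName("tuning-sketch")
      .config("spark.sql.shuffle.partitions", "400")
      .config("spark.sql.autoBroadcastJoinThreshold", (64 * 1024 * 1024).toString)
      .getOrCreate()

    import spark.implicits._

    // Code-side tuning: cache a DataFrame that is reused by several actions
    val transactions = spark.read.parquet("/data/transactions").cache()

    val failed  = transactions.filter($"status" === "FAILED").count()
    val settled = transactions.filter($"status" === "SETTLED").count()

    println(s"failed=$failed settled=$settled")
    spark.stop()
  }
}
```
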
InfoVision Labs India Pvt. Ltd. Pune
Shekhar Singh kshatri
Posted by Shekhar Singh kshatri
Pune
5 - 10 yrs
₹5L - ₹5L / yr
Hadoop
Scala
Spark
We at InfoVision Labs are passionate about technology and what our clients would like to get accomplished. We continuously strive to understand business challenges and the changing competitive landscape, and how cutting-edge technology can help position our client at the forefront of the competition. We are a fun-loving team of usability experts and software engineers, focused on mobile technology, responsive web solutions and cloud-based solutions.
Job Responsibilities:
  • Minimum 3 years of experience in Big Data skills required
  • Complete life cycle experience with Big Data is highly preferred
  • Skills – Hadoop, Spark, R, Hive, Pig, HBase and Scala
  • Excellent communication skills
  • Ability to work independently with no supervision
crest

at crest

1 recruiter
Kunal Kishore
Posted by Kunal Kishore
Pune
2 - 7 yrs
₹5L - ₹17L / yr
Test Driven Development (TDD)
Continuous Integration Tool
Git
RESTful APIs
Scala
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA and New Zealand, leveraging the latest technology, marketing intelligence and subject matter expertise. With handpicked SMEs in a range of sciences and technology teams working on the latest ECM, Scala, SAP and MS tech platforms, Crest not only develops quality STM content, but continuously enhances the channels through which it is delivered to the world. Crest is ISO 9001 certified and driven by over 1000 professionals in Technology, Research & Analysis and Marketing & BPM.
Specialties:
  1. Technology
  2. Research
  3. Marketing Intelligence
  4. Business Process Management