Scala Jobs in Pune

Explore top Scala job opportunities in Pune from top companies and startups. All jobs are added by verified employees who can be contacted directly below.

Consulting & implementation services in the areas of the Oil & Gas, Mining, and Manufacturing industries

Agency job
via Jobdost by Sathish Kumar
Ahmedabad, Hyderabad, Pune, Delhi
5 - 7 yrs
₹18L - ₹25L / yr
AWS Lambda
AWS Simple Notification Service (SNS)
AWS Simple Queuing Service (SQS)
Python
PySpark
+9 more
1. Data Engineer

Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python

Mandatory Requirements  

  • Experience in AWS Glue
  • Experience in Apache Parquet 
  • Proficient in AWS S3 and data lake 
  • Knowledge of Snowflake
  • Understanding of file-based ingestion best practices.
  • Scripting languages: Python & PySpark

CORE RESPONSIBILITIES 

  • Create and manage cloud resources in AWS 
  • Ingest data from sources that expose it through different technologies (RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems), and implement data ingestion and processing with the help of Big Data technologies 
  • Process and transform data using various technologies such as Spark and cloud services; understand your part of the business logic and implement it using the language supported by the base data platform 
  • Develop automated data quality checks to make sure the right data enters the platform, and verify the results of calculations 
  • Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
  • Define process improvement opportunities to optimize data collection, insights and displays.
  • Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible 
  • Identify and interpret trends and patterns from complex data sets 
  • Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders. 
  • Key participant in regular Scrum ceremonies with the agile teams  
  • Proficient at developing queries, writing reports and presenting findings 
  • Mentor junior members and bring best industry practices 
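The automated data-quality check mentioned above can be sketched in a few lines of plain Python. This is a minimal, stdlib-only illustration; the field names and rules are hypothetical, not taken from the job description:

```python
# Hypothetical data-quality gate: validate incoming records against a simple
# schema before they are allowed onto the platform.
REQUIRED_FIELDS = {"loan_id": str, "amount": float, "origination_date": str}

def validate_record(record: dict) -> list:
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    # Example business rule: amounts must be positive.
    if isinstance(record.get("amount"), float) and record["amount"] <= 0:
        errors.append("amount must be positive")
    return errors

def partition_by_quality(records):
    """Split a batch so only clean rows enter the platform; keep rejects with reasons."""
    clean, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        if errs:
            rejected.append((rec, errs))
        else:
            clean.append(rec)
    return clean, rejected
```

In a real pipeline this kind of check would typically run as a Glue/Spark job step, with rejected rows routed to a quarantine location for inspection.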

QUALIFICATIONS 

  • 5-7+ years’ experience as data engineer in consumer finance or equivalent industry (consumer loans, collections, servicing, optional product, and insurance sales) 
  • Strong background in math, statistics, computer science, data science or related discipline
  • Advanced knowledge of one of the following languages: Java, Scala, Python, C# 
  • Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake  
  • Proficient with:
      • Data mining/programming tools (e.g. SAS, SQL, R, Python)
      • Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
      • Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
  • Comfortable learning about and deploying new technologies and tools. 
  • Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines. 
  • Good written and oral communication skills and ability to present results to non-technical audiences 
  • Knowledge of business intelligence and analytical tools, technologies and techniques.

  

Familiarity and experience in the following is a plus:  

  • AWS certification
  • Spark Streaming 
  • Kafka Streaming / Kafka Connect 
  • ELK Stack 
  • Cassandra / MongoDB 
  • CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
Pune
12 - 16 yrs
₹22L - ₹35L / yr
Java
NodeJS (Node.js)
Scala
Javascript
MVC Framework
+5 more

Qualification: BE / BTech / MCA / ME / MTech


Required Skills

● Strong experience in architecting distributed cloud-based systems using technologies like Java, Scala, Angular, and Node.js. 

● Strong understanding of large scale distributed architectures, microservices architecture, reactive programming paradigms, design patterns, information architecture, application development processes and practices

 

● Knowledge of the learning domain is an added advantage

● Experience in working with Open Source software is desirable

● Must be very good in Java and JavaScript-related technologies

● Experience with one or more JavaScript frameworks such as jQuery, Twitter Bootstrap, Backbone, Angular, and others.

● Should have a strong understanding of transactional databases, and of multiple types of NoSQL databases like Cassandra, Elasticsearch, Neo4j, and others

● Understanding of Akka and the Play framework

● Should have good working knowledge of DevOps on cloud infrastructure

● Experience in TDD/BDD is required

● Knowledge of Kafka and Azure/Google Cloud is an added advantage

Good understanding of Software as a Service model preferred.

Posted by Nelson Xavier
Bengaluru (Bangalore), Pune, Hyderabad
4 - 8 yrs
₹10L - ₹25L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+4 more

Job responsibilities

- You will partner with teammates to create complex data processing pipelines in order to solve our clients' most complex challenges

- You will pair to write clean and iterative code based on TDD

- Leverage various continuous delivery practices to deploy, support and operate data pipelines

- Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available

- Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions

- Create data models and speak to the tradeoffs of different modeling approaches

- Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process

- Encourage open communication and advocate for shared outcomes

 

Technical skills

- You have a good understanding of data modelling and experience with data engineering tools and platforms such as Spark (Scala) and Hadoop

- You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (Hbase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting

- Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)

- You are comfortable taking data-driven approaches and applying data security strategy to solve business problems

- Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems

- You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments
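The pipeline shape described above (map, filter, aggregate) is engine-agnostic; the same structure appears whether the job runs in Spark (Scala) or locally. A toy batch stage in plain Python, with an invented log format, shows the shape without needing a cluster:

```python
from collections import Counter

def top_event_types(log_lines, n=2):
    """Toy batch stage mirroring the map -> filter -> reduce shape of a Spark job."""
    parsed = (line.split(",") for line in log_lines)               # map: parse CSV-ish lines
    events = (fields[1] for fields in parsed if len(fields) == 2)  # filter: drop malformed rows
    return Counter(events).most_common(n)                          # reduce: aggregate counts

logs = ["u1,click", "u2,view", "u3,click", "bad_row", "u4,click"]
print(top_event_types(logs))  # [('click', 3), ('view', 1)]
```

In Spark the same stage would be a `map`/`filter` over an RDD or DataFrame followed by a `groupBy` and count; the lazily evaluated generators here loosely mirror Spark's lazy transformations.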

 



Professional skills

- You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives

- An interest in coaching, sharing your experience and knowledge with teammates

- You enjoy influencing others and always advocate for technical excellence while being open to change when needed

- Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more

Pune, Chennai
5 - 9 yrs
₹15L - ₹20L / yr
Scala
PySpark
Spark
SQL Azure
Hadoop
+4 more
  • 5+ years of experience in a Data Engineering role on cloud environment
  • Must have good experience in Scala/PySpark (preferably in a Databricks environment)
  • Extensive experience with Transact-SQL.
  • Experience in Databricks/Spark.
  • Strong experience in data warehouse projects
  • Expertise in database development projects with ETL processes.
  • Manage and maintain data engineering pipelines
  • Develop batch processing, streaming and integration solutions
  • Experienced in building and operationalizing large-scale enterprise data solutions and applications
  • Using one or more of Azure data and analytics services in combination with custom solutions
  • Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
  • In-depth understanding of data management (e.g. permissions, security, and monitoring).
  • Cloud repositories (e.g. Azure, GitHub, Git)
  • Experience in an agile environment (Azure DevOps preferred).

Good to have

  • Manage source data access security
  • Automate Azure Data Factory pipelines
  • Continuous Integration / Continuous Deployment (CI/CD) pipelines, source repositories
  • Experience in implementing and maintaining CI/CD pipelines
  • Understanding of Power BI and Delta Lakehouse architecture
  • Knowledge of software development best practices.
  • Excellent analytical and organization skills.
  • Effective working in a team as well as working independently.
  • Strong written and verbal communication skills.
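As a rough illustration of the ETL flow this listing describes, here is a stdlib-only sketch, with csv and an in-memory sqlite3 database standing in for the Azure/Databricks services named above; the table and column names are invented:

```python
import csv
import io
import sqlite3

# Extract: read CSV text (stand-in for a file landed in a data lake).
raw = "id,amount\n1,10.5\n2,20.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and derive a column.
for r in rows:
    r["amount"] = float(r["amount"])
    r["amount_cents"] = int(r["amount"] * 100)

# Load: insert into a SQL table (stand-in for Synapse / SQL Database).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE fact_sales (id TEXT, amount REAL, amount_cents INTEGER)")
con.executemany("INSERT INTO fact_sales VALUES (:id, :amount, :amount_cents)", rows)
total = con.execute("SELECT SUM(amount_cents) FROM fact_sales").fetchone()[0]
print(total)  # 3050
```

A production version would replace each stage with the managed service (e.g. Data Factory orchestrating a Databricks notebook writing to Synapse), but the extract/transform/load boundaries stay the same.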
Posted by phani kalyan
Pune
9 - 14 yrs
₹20L - ₹40L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more
Job Id: SG0601

Hi,

Enterprise Minds is looking for Data Architect for Pune Location.

Req Skills:
Python, PySpark, Hadoop, Java, Scala

Tier 1 MNC

Agency job
Chennai, Pune, Bengaluru (Bangalore), Noida, Gurugram, Kochi (Cochin), Coimbatore, Hyderabad, Mumbai, Navi Mumbai
3 - 12 yrs
₹3L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more
Greetings,
We are hiring for a Tier 1 MNC for a Software Developer role with good knowledge of Spark, Hadoop, and Scala.

This company provides on-demand cloud computing platforms.

Agency job
via New Era India by Niharica Singh
Remote, Pune, Mumbai, Bengaluru (Bangalore), Gurugram, Hyderabad
15 - 25 yrs
₹35L - ₹55L / yr
Amazon Web Services (AWS)
Google Cloud Platform (GCP)
Windows Azure
Architecture
Python
+5 more
  • 15+ years of Hands-on technical application architecture experience and Application build/ modernization experience
  • 15+ years of experience as a technical specialist in Customer-facing roles.
  • Ability to travel to client locations as needed (25-50%)
  • Extensive experience architecting, designing and programming applications in an AWS Cloud environment
  • Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
  • Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
  • Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
  • Agile software development expert
  • Experience with continuous integration tools (e.g. Jenkins)
  • Hands-on familiarity with CloudFormation
  • Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
  • Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
  • Strong practical application development experience on Linux and Windows-based systems
  • Extracurricular software development passion (e.g. active open-source contributor)
Remote, Bengaluru (Bangalore), Pune, Chennai
3 - 25 yrs
₹12L - ₹80L / yr
Java
Javascript
React.js
Angular (2+)
AngularJS (1.x)
+5 more

Job Description

 

A top-of-the-line, premium software advisory & development services firm. Our customers include promising early-stage startups, Fortune 500 enterprises, and investors. We draw inspiration from Leonardo da Vinci's famous quote: "Simplicity is the ultimate sophistication."

Domains we work in

Multiple: publishing, retail, banking, networking, social sector, education, and many more.

Tech we use

Java, Scala, Golang, Elixir, Python, RoR, .Net, JS frameworks

More details on tech

You name it and we might be working on it. The important thing here is not the technology but the kind of solutions we provide to our clients. We believe that to solve some of the most complex problems, holistic thinking and solution design are of extreme importance. Technology is the most important tool for implementing the solution thus designed.

Skills & Requirements

Who should join us

We are looking for curious & inquisitive technology practitioners. Our customers see us as one of the most premium advisory and development services firms, hence most of the problems we work on are complex and often hard to solve. You can expect to work in small (2-5 person) teams, working very closely with customers in iteratively developing and evolving the solution. We are continually on the search for passionate, bright and energetic professionals to join our team.

So, if you are someone who has strong fundamentals in technology and wants to stretch beyond regular role-based boundaries, then Sahaj is the place for you. You will experience a world where there are no roles or grades, and you will play different roles and wear multiple hats to deliver a software project.

What would you do here

* Work on complex, custom-designed, scalable, multi-tiered software development projects

* Work closely with clients (commercial & social enterprises, startups), both business and technical staff members

* Be responsible for the quality of software and for resolving any issues regarding the solution

* Think through hard problems, not limited to technology and work with a team to realise and implement solutions

* Learn something new everyday

Below are key skills expected

* Development and delivery experience in any of the programming languages

* Passion for software engineering and craftsman-like coding prowess

* Great design and solutioning skills (OO & Functional)

* Experience including analysis, design, coding and implementation of large scale custom built object-oriented applications

* Understanding of code refactoring and optimisation issues

* Understanding of Virtualisation & DevOps. Experience with Ansible, Chef, Docker preferable

* Ability to learn new technologies and adapt to different situations

* Ability to handle ambiguity on a day to day basis

Pune, Bengaluru (Bangalore), Coimbatore, Hyderabad, Gurugram
3 - 10 yrs
₹18L - ₹40L / yr
Apache Kafka
Spark
Hadoop
Apache Hive
Big Data
+5 more

Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions, or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.



You’ll spend time on the following:

  • You will partner with teammates to create complex data processing pipelines in order to solve our clients’ most ambitious challenges
  • You will collaborate with Data Scientists in order to design scalable implementations of their models
  • You will pair to write clean and iterative code based on TDD
  • Leverage various continuous delivery practices to deploy data pipelines
  • Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
  • Develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
  • Create data models and speak to the tradeoffs of different modeling approaches

Here’s what we’re looking for:

 

  • You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop
  • You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (Hbase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting
  • Hands-on experience with MapR, Cloudera, Hortonworks, and/or cloud-based Hadoop distributions (AWS EMR, Azure HDInsight, Qubole, etc.)
  • You are comfortable taking data-driven approaches and applying data security strategy to solve business problems 
  • Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
  • Strong communication and client-facing skills with the ability to work in a consulting environment

Persistent System Ltd

Agency job
via Milestone Hr Consultancy by Haina khan
Pune, Bengaluru (Bangalore), Hyderabad
4 - 9 yrs
₹8L - ₹27L / yr
Python
PySpark
Amazon Web Services (AWS)
Spark
Scala
Greetings..

We have an urgent requirement for a Data Engineer / Sr. Data Engineer at a reputed MNC.

Exp: 4-9yrs

Location: Pune/Bangalore/Hyderabad

Skills: We need candidates with either Python & AWS, PySpark & AWS, or Spark & Scala
Pune, Bengaluru (Bangalore), Hyderabad, Nagpur
4 - 9 yrs
₹4L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+3 more
Greetings..

We have urgent requirements for Big Data Developer profiles at our reputed MNC.

Location: Pune/Bangalore/Hyderabad/Nagpur
Experience: 4-9yrs

Skills: PySpark & AWS, or Spark, Scala & AWS, or Python & AWS
Mumbai, Pune
8 - 14 yrs
₹10L - ₹15L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more
Job Responsibilities
1. Understand the business problem and translate it into data services and engineering outcomes.
2. Apply expertise in cloud application designs, cloud approval plans, and systems required to manage cloud storage.
3. Explore new technologies and learn new techniques to solve business problems creatively.
4. Collaborate with different teams (engineering and business) to build better data products.
5. Regularly evaluate cloud applications, hardware, and software.
6. Respond to technical issues in a professional and timely manner.
7. Identify the top cloud architecture solutions to successfully meet the strategic needs of the company.
8. Offer guidance on infrastructure migration techniques, including bulk application transfers into the cloud.
9. Manage the team and handle delivery of 2-3 projects.
JD | Data Architect 24-Aug-2021

Qualifications
Is education overrated? Yes, we believe so. But there is no other way to locate you, so we look for at least a degree in computer science, computer engineering, information technology, or a relevant field, along with:

1. 4-6+ years of experience in data handling
2. Hands-on experience with at least one programming language (Python, Java, Scala)
3. Understanding of SQL is a must
4. Big Data (Hadoop, Hive, YARN, Sqoop)
5. MPP platforms (Spark, Presto)
6. Data-pipeline & scheduler tools (Oozie, Airflow, NiFi)
7. Streaming engines (Kafka, Storm, Spark Streaming)
8. Any relational database or DW experience
9. Any ETL tool experience
10. Hands-on experience in pipeline design, ETL, and application development
11. Hands-on experience with cloud platforms like AWS, GCP, etc.
12. Good communication skills and strong analytical skills
13. Experience in team handling and project delivery
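Pipeline schedulers such as Oozie and Airflow, listed above, fundamentally order tasks by their dependencies before running them. A toy version of that ordering using only the Python standard library (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "report": {"load"},
}

# static_order() yields tasks so that every task appears after its dependencies.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'transform', 'quality_check', 'load', 'report']
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top of this, but the topological ordering is the core of any DAG-based tool.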
Pune
15 - 20 yrs
₹25L - ₹50L / yr
Engineering Management
Engineering Manager
Engineering Director
Engineering Head
VP of Engineering
+9 more

As a Director of Engineering, your role & responsibilities will include the following.  

  • Define the product roadmap and delivery planning. 
  • Provide technical leadership in design, delivery, and support of product software and platforms. 
  • Participate in driving technical architectural decisions for the product. 
  • Prioritize the features and deliverable artifacts. 
  • Augment the product development staff. 
  • Mentor managers to implement best practices that motivate and organize their teams. 
  • Prepare schedules, report status, and make hiring decisions. 
  • Evaluate and improve software development best practices. 
  • Provide DevOps and other processes to assure consistency, quality, and timeliness. 
  • Participate in interviewing as well as final hiring decisions. 
  • Guide and provide input to all strategic and technical planning for the entire product line. 
  • Monitor and provide input to evaluate and prioritize change requests.
  • Create and monitor the set of policies that establish standard development languages, tools, and methodologies; documentation practices; and examination procedures for developed systems, to ensure alignment with the overall architecture. 
  • Participate in project scope, schedule, and cost reviews. 
  • Understand and socialize product capabilities and limitations. 
  • Identify and implement ways to improve and promote quality, and demonstrate accuracy and thoroughness. 
  • Establish working relationships with external technology vendors. 
  • Integrate customer requirements through the engineering effort when championing next-generation products. 
  • Quickly gain an understanding of the company's technology and markets, and establish yourself as a credible leader. 
  • Release scheduling. 
  • Keep abreast of new technologies, with demonstrated knowledge of and experience with various technologies. 
  • Manage 3rd-party consulting partners/vendors implementing products. 
  • Prepare and submit weekly project status reports; prepare monthly reports outlining team assignments and/or changes, project status changes, and forecast project timelines.
  • Provide leadership to individuals or teams through coaching, feedback, development goals, and performance management. 
  • Prioritize employee career development to grow the internal pipeline of leadership talent. 
  • Prioritize, assign, and manage department activities and projects in accordance with the department's goals and objectives. Adjust hours of work, priorities, and staff assignments to ensure efficient operation based on workload. 

 

Qualification & Experience  

  • Master’s or bachelor’s degree in Computer Science, Business Information Systems, or a related field, or equivalent work experience required. 
  • Relevant certifications also preferred, among other indications of someone who values continuing education. 
  • 15+ years’ experience "living" with various operating systems, development tools, and development methodologies, including Java, data structures, Scala, Python, and NodeJS. 
  • 8+ years of individual-contributor software development experience.
  • 6+ years of management experience in a fast-growing product software environment, with proven ability to lead and engage development, QA, and implementation teams working on multiple projects. 
  • Idea generation and creativity in this position are a must, as is the ability to work with deadlines and to manage and complete projects on time and within budget. 
  • Proven ability to establish and drive processes and procedures with quantifiable metrics to measure the success and effectiveness of the development organization. 
  • Proven history of delivering on deadlines/releases without compromising quality. 
  • Mastery of engineering concepts and core technologies: development models, programming languages, databases, testing, and documentation. 
  • Development experience with compilers, web services, database engines, and related technologies. 
  • Experience with Agile software development and Scrum methodologies. 
  • Proven track record of delivering high-quality software products. 
  • A solid engineering foundation, indicated by a demonstrated understanding of product design, life cycle, software development practices, and support services, along with an understanding of standard engineering processes and software development methodologies. 
  • Experience coordinating the work and competences of software staff within functional project groups. 
  • Ability to work cross-functionally and as a team with other executive committee members. 
  • Strong verbal and written communication skills; communicate effectively with different business units about technology and processes using lay terms and descriptions. 

Experience Preferred: 

  • Experience building horizontally scalable solutions leveraging containers, microservices, and Big Data technologies, among other related technologies. 
  • Experience working with graphical user experience and user interface design. 
  • Experience working with object-oriented software development, web services, web development, or other similar technical products. 
  • Experience with database engines, languages, and compilers. 
  • Experience with user acceptance testing, regression testing, and integration testing. 
  • Experience working on open-source software projects for Apache and other great open-source software organizations. 
  • Demonstrable experience training and leading teams as a great people leader.
Posted by Akhil Ravipalli
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Chennai, Pune
2 - 9 yrs
₹15L - ₹60L / yr
Systems design
Data Structures
Algorithms
Java
Python
+6 more

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

 

Top Skills

 

  • You write high quality, maintainable, and robust code, often in Java or C++/C/Python/ROR/C#
  • You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
  • You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

 

  • You solve problems at their root, stepping back to understand the broader context.
  • You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
  • You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
  • You recognize and use design patterns to solve business problems.
  • You understand how operating systems work, perform and scale.
  • You continually align your work with Amazon’s business objectives and seek to deliver business value.
  • You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
  • You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
  • You communicate clearly with your team and with other groups and listen effectively.

 

Skills & Experience

 

  • Bachelors or Masters in Computer Science or relevant technical field.
  • Experience in software development and full product life-cycle.
  • Excellent programming skills in any object oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.
  • Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
  • Proficiency in SQL and data modeling.
Hyderabad, Bengaluru (Bangalore), Delhi, Gurugram, Pune, Chennai
3 - 9 yrs
₹2L - ₹15L / yr
C
C++
C#
Python
.NET
+14 more

Software Development Engineer – SDE 2

 

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

 

 Top Skills

You write high quality, maintainable, and robust code, often in Java or C++ or C#

You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.

You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

You solve problems at their root, stepping back to understand the broader context.

You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.

You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.

You recognize and use design patterns to solve business problems.

You understand how operating systems work, perform and scale.

You continually align your work with Amazon’s business objectives and seek to deliver business value.

You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.

You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.

You communicate clearly with your team and with other groups and listen effectively.

 

Skills & Experience

Bachelors or Masters in Computer Science or relevant technical field.

Experience in software development and full product life-cycle.

Excellent programming skills in any object-oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.

Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.

Proficiency in SQL and data modeling.

Posted by Archana J
Bengaluru (Bangalore), Hyderabad, Delhi, Pune, Chennai
2 - 9 yrs
₹10L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+4 more
Hi,

Please find below JD and do reply with updated resume if you are interested.

Software Development Engineer
Bengaluru / Hyderabad / Chennai / Delhi
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

Top Skills

• You write high quality, maintainable, and robust code, often in Java or C++.
• You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
• You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities

• You solve problems at their root, stepping back to understand the broader context.
• You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
• You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
• You recognize and use design patterns to solve business problems.
• You understand how operating systems work, perform and scale.
• You continually align your work with Amazon’s business objectives and seek to deliver business value.
• You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
• You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
• You communicate clearly with your team and with other groups and listen effectively.

Skills & Experience

• Bachelors or Masters in Computer Science or relevant technical field.
• Experience in software development and full product life-cycle.
• Excellent programming skills in any object oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.
• Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
• Proficiency in SQL and data modeling.



About Amazon.com

“Many of the problems we face have no textbook solution, and so we - happily - invent new ones.” – Jeff Bezos

Amazon.com – a place where builders can build. We hire the world's brightest minds and offer them an environment in which they can invent and innovate to improve the experience for our customers. A Fortune 100 company based in Seattle, Washington, Amazon is the global leader in e-commerce. Amazon offers everything from books and electronics to apparel and diamond jewelry. We operate sites in Australia, Brazil, Canada, China, France, Germany, India, Italy, Japan, Mexico, Netherlands, Spain, United Kingdom and United States, and maintain dozens of fulfillment centers around the world which encompass more than 26 million square feet.

Technological innovation drives the growth of Amazon, offering our customers more selection, convenient shopping, and low prices. Amazon Web Services provides developers and small to large businesses access to the horizontally scalable state of the art cloud infrastructure like S3, EC2, AMI, CloudFront and SimpleDB, that powers Amazon.com. Developers can build any type of business on Amazon Web Services and scale their application with growing business needs.

We want you to help share and shape our mission to be Earth's most customer-centric company. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of invention that is part of our DNA. We do this every day by inventing elegant and simple solutions to complex technical and business problems. We're making history and the good news is that we've only just begun.


About Amazon India

Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.

We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.



We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India.

Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.

Thanks and Regards,
Archana J
Recruiter (Tech) | Consumer TA
Posted by Nithya Nagarathinam
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Gurugram, India
3 - 9 yrs
₹1L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+6 more

Role- Software Development Engineer-2

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

Top Skills

You write high quality, maintainable, and robust code, often in Java, C++ or C#

You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.

You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

You solve problems at their root, stepping back to understand the broader context.

You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.

You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.

You recognize and use design patterns to solve business problems.

You understand how operating systems work, perform and scale.

You continually align your work with Amazon’s business objectives and seek to deliver business value.

You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.

You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.

You communicate clearly with your team and with other groups and listen effectively.

Skills & Experience

Bachelor's or Master's in Computer Science or a relevant technical field.

Experience in software development and full product life-cycle.

Excellent programming skills in any object-oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.

Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.

Proficiency in SQL and data modeling.

Agency job
via Response Informatics by Swagatika Sahoo
Hyderabad, Pune, Chennai, Bengaluru (Bangalore), Mumbai
5 - 6 yrs
₹18L - ₹25L / yr
Big Data
Big Data Engineer
Spark
Apache Spark
Scala
+2 more

 

Experience: 5-6+ years

Must Have

  • Apache Spark, Spark Streaming, Scala Programming, Apache HBase
  • Unix Scripting, SQL Knowledge

Good to Have

  • Experience working with Graph Databases, preferably JanusGraph DB
  • Experience working with Document Databases and Apache Solr

 

Job Description

Data Engineer with experience in the following areas:

 

  • Designing and implementing high-performance data ingestion pipelines from multiple sources using Scala and Apache Spark.
  • Experience with event-based Spark Streaming technologies to ingest data.
  • Developing scalable and re-usable frameworks for ingesting data sets.
  • Integrating end-to-end data pipelines that take data from source systems to target data repositories, ensuring data quality and consistency are maintained at all times.
  • Preference for Big Data-related certifications like Cloudera Certified Professional (CCP) and Cloudera Certified Associate (CCA).
  • Working within Agile delivery methodology to deliver product implementations in iterative sprints.
  • Strong knowledge of Data Management principles.
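The "scalable and re-usable frameworks" bullet above can be pictured with a minimal plain-Scala sketch. All names here are invented for illustration; a production pipeline of this kind would build on Spark DataFrames and real source/sink connectors (JDBC, S3, Kafka) rather than in-memory lists:

```scala
// Minimal sketch of a composable ingestion pipeline in plain Scala.
// All names are illustrative; a production version would use Spark
// DataFrames and real connectors instead of Lists.

final case class Record(id: String, payload: String)

trait Source    { def read(): List[Record] }
trait Transform { def apply(in: List[Record]): List[Record] }
trait Sink      { def write(out: List[Record]): Unit }

final class InMemorySource(data: List[Record]) extends Source {
  def read(): List[Record] = data
}

// A simple quality gate: drop records with empty payloads, mirroring
// the "quality and consistency" checks described in the bullets above.
object DropEmptyPayloads extends Transform {
  def apply(in: List[Record]): List[Record] = in.filter(_.payload.nonEmpty)
}

final class CollectingSink extends Sink {
  var received: List[Record] = Nil
  def write(out: List[Record]): Unit = received = out
}

// Wires source -> transforms -> sink; reusable across data sets.
final class Pipeline(source: Source, transforms: List[Transform], sink: Sink) {
  def run(): Unit =
    sink.write(transforms.foldLeft(source.read())((acc, t) => t(acc)))
}
```

Wiring a source, a quality gate and a sink through `Pipeline` keeps each ingestion job a one-line composition, which is roughly what "reusable framework" means in practice.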

 

Location: PAN INDIA


Integral Ad Science

Agency job
via VIPSA TALENT SOLUTIONS by Prashma S R
Pune
5 - 8 yrs
₹9L - ₹25L / yr
Java
Hadoop
Apache Spark
Scala
Python
+3 more
  • 6+ years of recent hands-on Java development
  • Developing data pipelines in AWS or Google Cloud
  • Java, Python, JavaScript programming languages
  • Great understanding of designing for performance, scalability, and reliability of data-intensive applications
  • Hadoop MapReduce, Spark, Pig. Understanding of database fundamentals and advanced SQL knowledge.
  • In-depth understanding of object-oriented programming concepts and design patterns
  • Ability to communicate clearly to technical and non-technical audiences, verbally and in writing
  • Understanding of the full software development life cycle, agile development and continuous integration
  • Experience in Agile methodologies including Scrum and Kanban

A leading software client in Pune

Agency job
via Sapwood Ventures by Sakshi G
Pune
3 - 8 yrs
₹15L - ₹20L / yr
React.js
Redux/Flux
Java
Data Structures
Algorithms
+7 more
Job Title: Software Developer
Technologies: React JS
Experience: 3 to 8 years
Notice Period: Immediate joiner
Job Location: Kalyani Nager, Pune, MH
Job Summary
Looking for React JS developers who will be responsible for architecting and building applications, as
well as coordinating with the teams responsible for other layers of the product infrastructure.
Responsibilities and Duties:-

Responsible for development of new highly-responsive, web-based user interface

Build pixel-perfect, buttery smooth UIs across both mobile platforms.

Diagnose and fix bugs and performance bottlenecks for performance that feels native.

Reach out to the open source community to encourage and help implement mission-critical
software fixes—React Native moves fast and often breaks things.

Maintain code and write automated tests to ensure the product is of the highest quality.

Transition existing React web apps to React Native.

Construct visualizations that are able to depict vast amounts of data

Work and collaborate with the rest of the engineering team

Work with product team and graphic designers

Develop a flexible and well-structured front-end architecture, along with the APIs to support it
Required Experience, Skills and Qualifications:-
● Experience with automated testing suites like Jest or Mocha
● Experience with JavaScript, React, HTML/CSS, REST APIs
● Experience with Git is a plus
● Hands-on experience with Redux
● Familiarity with native build tools like Xcode, Gradle (Android Studio, IntelliJ)
● Understanding of REST APIs, the document request model, and offline storage
Posted by Apurva kalsotra
Mohali, Gurugram, Pune, Bengaluru (Bangalore), Hyderabad, Chennai
3 - 8 yrs
₹2L - ₹9L / yr
Data engineering
Data engineer
Spark
Apache Spark
Apache Kafka
+13 more

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
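Several of the bullets above mention quality checks and extracting value from large, messy data sets; a deliberately tiny Scala sketch of such checks (the Row type and both predicates are invented for illustration, not taken from any specific system) might look like:

```scala
// Illustrative data-quality checks of the kind a pipeline might run
// before loading a batch; all names here are invented for this sketch.

final case class Row(key: String, value: Option[Double])

object QualityChecks {
  // Fraction of rows with a missing value; 0.0 means fully populated.
  def nullRate(rows: Seq[Row]): Double =
    if (rows.isEmpty) 0.0
    else rows.count(_.value.isEmpty).toDouble / rows.size

  // True when every key appears exactly once (no duplicate ingestion).
  def keysUnique(rows: Seq[Row]): Boolean =
    rows.map(_.key).distinct.size == rows.size
}
```

In a real pipeline these predicates would run over a distributed collection and feed an alerting or quarantine step rather than return plain values.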
Posted by Apurva kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus 
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Remote only
2 - 15 yrs
₹2L - ₹70L / yr
Data engineering
Data Engineer
Python
Big Data
Spark
+1 more
  • Proficiency in engineering practices and writing high-quality code, with expertise in either one of Java, Scala or Python
  • Experience in Big Data technologies (Hadoop/Spark/Hive/Presto/HBase) and streaming platforms (Kafka/NiFi/Storm)
  • Experience in distributed search (Solr/Elasticsearch), in-memory data grids (Redis/Ignite), cloud-native apps and Kubernetes is a plus
  • Experience in building REST services and APIs following best practices of service abstractions and microservices. Experience in orchestration frameworks is a plus
  • Experience in Agile methodology and CI/CD - tool integration, automation, configuration management
  • Being a committer in one of the open-source Big Data technologies - Spark, Hive, Kafka, Yarn, Hadoop/HDFS - is an added advantage
Posted by Neeta Singh Mehta
Bengaluru (Bangalore), Hyderabad, Noida, Gurugram, Mumbai, Pune, NCR (Delhi | Gurgaon | Noida)
3 - 8 yrs
₹6L - ₹32L / yr
Java
Python
Ruby
PHP
C++
+9 more

Greetings from Amazon!

 

It is our pleasure to personally invite you to apply for a job with Amazon Development Centre India (ADCI). At Amazon we are inclined to hire people with a passion for technology, and you happen to be one of the shortlisted candidates. Our business is committed to recognizing potential and creating teams that embrace innovation.

 

Please find the eligibility criteria and requirements below:

 

Job title: SDE – II (Software Development Engineer)
Role Opportunity: Permanent/Full Time/FTE/Regular
Work Location: Hyderabad/Bangalore/Gurgaon

 

Must Have

  • Strong Exposure to Data Structures, Algorithms, Coding, System Design (LLD, HLD, OOAD), Distributed systems, problem solving skills, Architecture (MVC/Microservices), logical thinking.

Amazon (ADCI) - If you are looking for an opportunity to solve deep technical problems and build innovative solutions in a fast paced environment working with smart, passionate software developers, this might be the role for you. Amazon’s transportation systems get millions of packages to customers worldwide faster and cheaper while providing a world class customer experience – from checkout to shipment tracking to delivery. Our software systems include services that handle thousands of requests per second, make business decisions impacting billions of dollars a year, integrate with a network of small and large carriers worldwide, manage business rules for millions of unique products, and improve the experience for millions of online shoppers. With rapid expansion into new geographies, innovations in supply chain, delivery models and customer experience, an increasingly complex transportation network, an ever expanding selection of products and a growing number of shipments worldwide, we have an opportunity to build software that scales the business, leads the industry through innovation and delights millions of customers worldwide.

 

As an SDE, you will develop a deep understanding of our business, work closely with development teams and own the architecture and end-to-end delivery of software components.

About Amazon India:

Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.

We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.

We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India. Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.

 

Basic Qualifications:

 

  • 3+ years’ experience building successful production software systems
  • A solid grounding in Computer Science fundamentals (based on a BS or MS in CS or related field)
  • The ability to convert raw requirements into good designs while exploring technical feasibility tradeoffs
  • Expertise in system design (design patterns, LLD, HLD, SOLID principles, OOAD, distributed systems, etc.), architecture (MVC/microservices)
  • Good understanding of at least some modern programming languages (Java) and open-source technologies (C++, Python, Scala, C#, PHP, Ruby, etc.)
  • Excellence in technical communication
  • Has experience in mentoring other software developers

 

Preferred Qualifications:

 

  • BS/MS in Computer Science or equivalent
  • Experience developing service oriented architectures and an understanding of design for scalability, performance and reliability
  • Demonstrated ability to mentor other software developers to maintain architectural vision and software quality
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment
  • Expertise in delivering high-quality, innovative application
  • Strong desire to build, sense of ownership, urgency, and drive
  • Strong organizational and problem solving skills with great attention to detail
  • Ability to triage issues, react well to changes, work with teams and ability to multi-task on multiple products and projects.
  • Experience building highly scalable, high availability services
  • The ideal candidate will be a visionary leader, builder and operator.
  • He/she should have experience leading or contributing to multiple simultaneous product development efforts and initiatives.
  • He/she needs to balance technical leadership with strong business judgment to make the right decisions about technology choices.
  • He/she needs to be constantly striving for simplicity, and at the same time demonstrate significant creativity, innovation and judgment
  • Proficiency in at least one modern programming language.
  • Experience with SQL or NoSQL databases.
  • Strong sense of ownership, urgency, and drive.
  • Demonstrated leadership abilities in an engineering environment in driving operational excellence and best practices.
  • Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment.
  • Excellent communication, collaboration, reporting, analytical and problem solving skills.

 

 

Good to Have:

  • Knowledge of professional software engineering practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Experience with enterprise-wide systems
  • Experience influencing software engineers best practices within your team
  • Hands-on expertise in many disparate technologies, typically ranging from front-end user interfaces through to back-end systems and all points in between
  • Strong written and verbal communication skills preferred

 

Key Points to remember:

 

  • Strong knowledge of the Software Development Life Cycle methodology
  • Technical design, development and implementation decisions on the use of technology in area(s) of specialization.
  • Write or modify programming code to suit customer's needs.
  • Unit test to assure meets requirements, including integration test as needed.
  • Ability to understand and analyze issues and uses judgment to make decisions.
  • Strong problem solving & troubleshooting skills
  • Strong communication skills
  • Responsible for self-development according to professional development plan

Pune, Hyderabad
7 - 12 yrs
₹7L - ₹20L / yr
Apache Spark
Big Data
Spark
Scala
Hadoop
+3 more
We at Datametica Solutions Private Limited are looking for a Big Data Spark Lead with a passion for cloud and knowledge of different on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, Hortonworks and the like.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.

Job Description
Experience : 7+ years
Location : Pune / Hyderabad
Skills :
  • Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings
  • Participate and contribute in Solution Design and Solution Architecture for implementing Big Data Projects on-premise and on cloud
  • Technical Hands on experience in design, coding, development and managing Large Hadoop implementation
  • Proficient in SQL, Hive, Pig, Spark SQL, Shell Scripting, Kafka, Flume, Sqoop with large Big Data and Data Warehousing projects, with a Java, Python or Scala based Hadoop programming background
  • Proficient with various development methodologies like waterfall, agile/scrum and iterative
  • Good Interpersonal skills and excellent communication skills for US and UK based clients

About Us!
A global Leader in the Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.

We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.

Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.


We have our own products!
Eagle – Data Warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican – Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.

Why join us!
Datametica is a place to innovate, bring new ideas to life and learn new things. We believe in building a culture of innovation, growth and belonging. Our people and their dedication over these years are the key factors in achieving our success.

Benefits we Provide!
Working with Highly Technical and Passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy

Check out more about us on our website below!
www.datametica.com

Service based company

Agency job
via Tech - Soul Technologies by Rohini Shinde
Pune
6 - 12 yrs
₹6L - ₹28L / yr
Big Data
Apache Kafka
Data engineering
Cassandra
Java
+1 more

Primary responsibilities:

  • Architect, Design and Build high performance Search systems for personalization, optimization, and targeting
  • Designing systems with Solr, Akka, Cassandra, Kafka
  • Algorithmic development with primary focus Machine Learning
  • Working with rapid and innovative development methodologies like: Kanban, Continuous Integration and Daily deployments
  • Participation in design and code reviews and recommend improvements
  • Unit testing with JUnit, Performance testing and tuning
  • Coordination with internal and external teams
  • Mentoring junior engineers
  • Participate in Product roadmap and Prioritization discussions and decisions
  • Evangelize the solution with Professional services and Customer Success teams

 

Pune
4.5 - 10 yrs
₹5L - ₹20L / yr
Scala
Java
Javascript
Akka
Play Framework

We are on a quest to find a Senior Software Developer - Scala with several years of experience who will help us extend and maintain our platform.

You will join a cross-functional team with different nationalities, backgrounds and experience levels. The agile team is co-located, including the Product Managers.

You will collaborate with all team members in order to deliver the best solutions that enhance our platform.

About Springer Nature India Pvt. Ltd:

 Springer Nature opens the doors to discovery for researchers, educators, clinicians and other professionals. Every day, around the globe, our imprints, books, journals, platforms and technology solutions reach millions of people. For over 175 years our brands and imprints have been a trusted source of knowledge to these communities and today, more than ever, we see it as our responsibility to ensure that fundamental knowledge can be found, verified, understood and used by our communities – enabling them to improve outcomes, make progress, and benefit the generations that follow.

Visit: group.springernature.com and follow @SpringerNature

 

If you are still wondering, why should you work with us. Here are 5 reasons why?

  1. Springer Nature is one of the world's largest publishing companies. Nobel laureates publish their research at Springer.
  2. We are truly a digital organization and Springer Nature Pune is at the helm of this digitization.
  3. We not only believe in but preach providing a good work-life balance to our employees.
  4. We are investing in building our products using machine learning and NLP.
  5. We work with the latest technologies like AWS and Scala.

About the team:

Backend - Adis – PV is a scientific analysis platform being built for extracting content from scientific articles and providing meaningful insights.

Insights are then published in a structured way, so that they can be made accessible to end users, via feeds delivery or through the platform.

Backend - Adis – PV will be a production system for all databases under one IT landscape and under one umbrella.

Job Type: Permanent    
                                   

Job Location: Magarpatta City, Pune - India  (Work from home until further notice)

Years of Experience 6 to 10 years

What we are looking for

Educational Qualification:

B.Sc., BCA, BCS, B.E., B.Tech, M.Tech, MCA and M.Sc.

Skill Matrix:

Primary Language Skills: Java 8, Scala

Framework: Play Framework

Messaging: RabbitMQ

Ideologies: TDD / ATDD, Pair Programming

Database: SQL and NoSQL
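
The skill matrix above pairs Scala with TDD/ATDD; as a small hedged illustration of the test-first style (the DOI validator and its rules are invented for this example and are unrelated to any actual Springer Nature code), one might first write the assertions and then implement:

```scala
// Tiny test-first example: conceptually, the assertions were written
// before this function existed. The DOI validator and its rules are
// invented for illustration only.

object Doi {
  // Accepts the general shape of a DOI such as "10.1000/xyz123":
  // a "10." prefix with a registrant code, a slash, a non-empty suffix.
  def looksValid(doi: String): Boolean =
    doi.split("/", 2) match {
      case Array(prefix, suffix) =>
        prefix.startsWith("10.") && prefix.length > 3 && suffix.nonEmpty
      case _ => false
    }
}
```

Pair programming plus this red-green rhythm is what the "TDD / ATDD, Pair Programming" line in the matrix refers to in day-to-day work.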

Challenges

  • You will help us continuously improve our platform
  • Together we will create best-in-class services that support the needs of our customers
  • Taking part in team ceremonies like grooming, planning and retrospectives
  • Develop new features
  • Improve code quality by doing pair programming or code reviews
  • Continuously improve and monitor our product

Key Responsibilities

  • Own and consistently deliver high quality end-to-end product features keeping in view technical & business expectations
  • Add meaty features to the product which will deliver substantial business value
  • Pioneer clean coding and continuous code refactoring
  • Understands and appreciates existing design and architecture of the product
  • Understands pros and cons of various technology options available
  • Takes technical ownership of some of the sub-systems of the product
  • Makes changes in the product designs to achieve the required business value / mileage
  • Identifies and addresses technical debt
  • Understands the technical vision and road-map of the product and expectations
  • Understands the purview of key pieces of deliverables and owns a few of these pieces

Day at work

  • Pioneer proofs of concept of new technologies keeping in view the product road-map and business priorities
  • Self-study and share your learning within the team and across teams
  • Provide required help to other team members
  • Participate in various team events and work towards the objectives of these events
  • Make meaningful suggestions to make ceremonies more effective

About You

  • You have several years of experience with software development
  • You have worked successfully with product teams in the past and ideally have some experience mentoring junior developers
  • You like working in a collaborative environment where there is collective ownership of the code
  • You work with Continuous Integration and always strive for Continuous Delivery
  • You like to share knowledge and enable others, increasing your whole team's performance

Fast-paced Startup

Agency job
via Kavayah People Consulting by Kavita Singh
Location: Pune
Experience: 3 - 6 yrs
Salary: ₹15L - ₹22L / yr
Skills: Big Data, Data engineering, Hadoop, Spark, Apache Hive, +6 more

Years of Exp: 3-6+ Years
Skills: Scala, Python, Hive, Airflow, Spark

Languages: Java, Python, Shell Scripting

GCP: BigTable, DataProc,  BigQuery, GCS, Pubsub

OR
AWS: Athena, Glue, EMR, S3, Redshift

MongoDB, MySQL, Kafka

Platforms: Cloudera / Hortonworks
AdTech domain experience is a plus.
Job Type - Full Time 
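The Spark work listed above is, at its core, functional transformations over collections; Spark's RDD/Dataset API deliberately mirrors the Scala collections API, so the shape of a typical job can be sketched without a cluster. A hedged sketch (in real Spark code the `Seq` would be an `RDD` or `Dataset`, and `groupBy`/`map` would be `reduceByKey`):

```scala
// Sketch: the core transformation of a typical Spark job, written
// against plain Scala collections. In Spark the same flatMap/filter/
// reduce-by-key pipeline runs distributed over an RDD or Dataset.
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split("\\s+"))             // tokenize each line
      .filter(_.nonEmpty)                   // drop empty tokens
      .groupBy(identity)                    // ~ reduceByKey in Spark
      .map { case (w, ws) => w -> ws.size }

  def main(args: Array[String]): Unit = {
    val counts = wordCount(Seq("spark and scala", "scala jobs in pune"))
    println(counts("scala")) // prints 2
  }
}
```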

Location: Bengaluru (Bangalore), Hyderabad, Pune
Experience: 9 - 16 yrs
Salary: ₹7L - ₹32L / yr
Skills: Big Data, Scala, Spark, Hadoop, Python, +1 more
Greetings,
 
We have an urgent requirement for the post of Big Data Architect in a reputed MNC company.

Location: Pune / Nagpur, Goa, Hyderabad / Bangalore

Job Requirements:

  • 9+ years of total experience, preferably in the Big Data space.
  • Experience creating Spark applications using Scala to process data.
  • Experience in scheduling and troubleshooting/debugging Spark jobs in steps.
  • Experience in Spark job performance tuning and optimization.
  • Experience in processing data using Kafka/Python.
  • Experience and understanding in configuring Kafka topics to optimize performance.
  • Proficiency in writing SQL queries to process data in a Data Warehouse.
  • Hands-on experience with Linux commands to troubleshoot/debug issues, and with creating shell scripts to automate tasks.
  • Experience with AWS services like EMR.
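Configuring Kafka topics for performance, as called out above, usually starts with choosing the partition count (an upper bound on consumer parallelism) and the replication factor (fault tolerance). A hedged sketch using the stock `kafka-topics.sh` CLI; the broker address and topic name are illustrative:

```shell
# Sketch: create a topic with enough partitions for consumer
# parallelism and a replication factor for fault tolerance.
# Broker address and topic name below are illustrative.
kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic page-events \
  --partitions 6 \
  --replication-factor 3

# Inspect the result (partition leaders, ISR, configs):
kafka-topics.sh --bootstrap-server localhost:9092 \
  --describe --topic page-events
```

Partition count can be increased later, but only upward, and doing so reshuffles key-to-partition assignment, so it pays to size it up front.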
Location: Remote, Pune
Experience: 4 - 9 yrs
Salary: ₹5L - ₹25L / yr
Skills: Scala, ORM, Amazon Web Services (AWS), JIRA Agile, Play Framework

Roles and responsibilities

    • Develop well-designed, performant, and scalable microservices
    • Write reusable, testable, and efficient code that follows software development best practices
    • Integrate data storage solutions including databases, key-value stores, blob stores, etc.
    • Expose business functionality to frontend/mobile applications and partner systems through secure and scalable APIs
    • Build integrations with third-party applications through APIs to ingest and process data
    • Ensure security and data protection aspects within the applications
    • Contribute to DevOps by building CI/CD pipelines to automate releases
    • Ensure high performance and availability of distributed systems and applications
    • Interact directly with client project team members and operational staff to support live customer deployments and production issues
Requirements
  • 4+ years of experience in developing applications using Scala and related technologies.
  • Thorough understanding of multithreading concepts and async execution using the Actor model.
  • Thorough understanding of the Play framework, GraphQL, and gRPC technologies.
  • Experience in using DAL and ORM (Object Relational Mapper) libraries for data access.
  • Experience in developing and hosting APIs and integrating with external applications.
  • Experience in building data models and repositories using relational and NoSQL databases.
  • Knowledge of JIRA, Bitbucket, and agile methodologies.
  • Good to have: knowledge of AWS services like Lambda, DynamoDB, Kinesis, and others.
  • Understanding of the fundamental design principles behind a scalable application.
  • Familiarity with event-driven programming and distributed architectures.
  • Strong unit testing and debugging skills.
  • Affinity for learning and applying new technologies and solving new problems.
  • Effective organizational skills with strong attention to detail.
  • Experience in working with Docker is a plus.
  • Comfortable working in a Unix/Linux environment.
  • Strong communication skills, both written and verbal.
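The async-execution requirement above is commonly met with Akka actors or, in the standard library alone, with `Future`. A minimal sketch of composing two async steps without blocking (the `fetchUser`/`fetchOrders` services are hypothetical stand-ins for real I/O):

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global

// Sketch: composing async calls with standard-library Futures.
// fetchUser/fetchOrders are hypothetical stand-ins for real I/O.
object AsyncSketch {
  def fetchUser(id: Int): Future[String] = Future(s"user-$id")

  def fetchOrders(user: String): Future[List[String]] =
    Future(List(s"$user-order-1", s"$user-order-2"))

  // The for-comprehension sequences the two async steps; no thread
  // blocks while waiting for either result.
  def userOrders(id: Int): Future[List[String]] =
    for {
      user   <- fetchUser(id)
      orders <- fetchOrders(user)
    } yield orders

  def main(args: Array[String]): Unit = {
    // Await only at the program edge; production code keeps
    // composing Futures instead of blocking.
    val orders = Await.result(userOrders(42), 2.seconds)
    println(orders) // List(user-42-order-1, user-42-order-2)
  }
}
```

With Akka, the same flow would live inside an actor's message handler; the `Future` composition shown here is the common denominator.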
Location: Pune
Experience: 1.5 - 2.5 yrs
Salary: ₹1L - ₹10L / yr
Skills: Java, Scala, Python, Maven, Oracle, +2 more
Mandatory Skills:
As a polyglot developer, ideally you should have:
  • 1.5+ years of development experience using any technology such as Java, Scala, Python, or similar exciting technologies.
  • Hands-on experience in coding and implementation of complex, custom-built applications.
  • Working knowledge of a build tool like Maven/sbt and code versioning systems like Git/Bitbucket/CVS/SVN.
  • Familiarity with a few databases, like MySQL, Oracle, PostgreSQL, SQL Server, NoSQL, etc.
  • Great OO skills, including strong design patterns knowledge.
  • Good communication and the ability to work in a consulting environment are essential.
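"Strong design patterns knowledge" often looks different in Scala than in classic OO languages: a pattern like Strategy collapses into passing a function value. A small illustrative sketch (the pricing strategies and figures are made up):

```scala
// Sketch: the classic Strategy pattern expressed idiomatically in
// Scala -- a strategy is just a function value passed as a parameter,
// no interface/subclass boilerplate needed.
object PricingStrategies {
  type Pricing = Double => Double // strategy: base price => final price

  val regular: Pricing   = price => price
  val seasonal: Pricing  = price => price * 0.90 // 10% off
  val clearance: Pricing = price => price * 0.50 // 50% off

  // The algorithm varies by the strategy plugged in.
  def checkout(prices: List[Double], strategy: Pricing): Double =
    prices.map(strategy).sum

  def main(args: Array[String]): Unit = {
    val cart = List(100.0, 50.0)
    println(checkout(cart, regular)) // 150.0
    println(checkout(cart, clearance))
  }
}
```

The same shift applies to other GoF patterns: Command and Template Method also reduce to higher-order functions in Scala.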
Good to Have:
  • Think through hard problems in a consultancy environment, and work with amazing people to make the solutions a reality.
  • Work in a dynamic, collaborative, non-hierarchical environment where your talent is valued over your job title or years of experience.
  • Build custom software using the latest technologies and tools.
  • Craft your own career path.
You'll be responsible for:
  • Providing solutions to real problems in the Big Data world.
  • R&D on the latest tools, techniques, and cloud services.
  • Automating manual, time-consuming tasks.
  • Hands-on coding, usually in a pair-programming environment.
  • Working in highly collaborative teams and building quality code.
  • Working in lots of different domains and client environments.
  • Understanding the business domain deeply.
What we do:
We are a team of technology-agnostic, passionate people who aim to provide solutions to real-world Big Data problems.
We are building solutions that will help our customers automatically migrate their RDBMS systems to the latest Big Data platforms and tools such as Spark, Apex, and Flink. For more information, do visit our products webpage.
Posted by Rashmi Poovaiah
Location: Bengaluru (Bangalore), Chennai, Pune
Experience: 4 - 10 yrs
Salary: ₹8L - ₹15L / yr
Skills: Big Data, Hadoop, Spark, Apache Kafka, HiveQL, +2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be a part of building an advanced analytical platform leveraging Big Data technologies and transforming legacy systems. This role offers an exciting, fast-paced, constantly changing and challenging work environment, and will play an important part in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • Overall minimum of 4 to 8 years of software development experience and 2 years of Data Warehousing domain knowledge.
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
  • Excellent knowledge of SQL and Linux shell scripting.
  • Bachelor's/Master's/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively.
  • Proven experience in coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment.
  • Ability to manage a diverse and challenging stakeholder community.
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams.

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of Scrum discussions and take requirements
  • Adhere to the Scrum timeline and deliver accordingly
  • Participate in a team environment for design, development, and implementation
  • Should take on L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should create healthy harmony within the team
  • Own interaction points with members of the core team (e.g. BA team, testing and business teams) and any other relevant stakeholders
Posted by Sudhir Patil
Location: Pune
Experience: 0 - 2 yrs
Salary: ₹2L - ₹4L / yr
Skills: Java, J2EE, Data Structures, Algorithms, Scala, +4 more

About the Company

 

Leap Info Systems Pvt. Ltd. is a software product company with products and solutions in convergent lighting controls and automation. Recently, Leap acquired elitedali, the world's first Niagara-based lighting controls and automation solution, from one of the leading US organizations.

 

We are a passionate team on a mission to develop innovative controls and automation products. We are expanding our product development team and are in search of like-minded highly passionate team members who would like to contribute to the leading lighting controls and automation product. We have customers in India, Europe, USA, and Australia.

 

Eligibility

 

Any suitable graduate (0 to 2 years of experience) who meets the prescribed qualities and job responsibilities. Preferred but not limited to engineering graduates in Computer/IT/Instrumentation/E&TC etc.
Job Responsibilities

 

Be a part of the product development team, maintaining existing products as well as developing new ones based on Java and the Niagara Software framework.

 

Qualities

 

  • Self-Learner
  • Analytical skills
  • Able to work with Cross-functional team
  • Problem-solving approach
  • Good Team Player with Positive vibes

 

What you must have,

  • Hands-on programming experience at the academic or hobby level.
  • Good knowledge of Java/J2EE-related technologies
  • Good knowledge of solution-based approaches
  • Good communication skills - phone, email, and in-person.

 

 

What you can expect from LEAP,

  • Positive, conducive working environment to grow with cross-functional team members.
  • Flexible working hours with defined responsibilities.
  • Opportunity to work with a leading open-standard automation framework like Niagara Software, and lighting controls technologies like DALI, wireless, etc.
  • Be an active part of an emerging global product company
Location: Pune
Experience: 2 - 6 yrs
Salary: ₹4L - ₹12L / yr
Skills: C++, C#, Algorithms, Java, Data Structures, +3 more
Responsibilities:
  • Work with developers to design algorithms and flowcharts
  • Prepare GUI dummy screens for proposed software development using Excel VBA (to give an overview of how the software buttons and flow of information should work)
  • Coordinate with the software developer team to explain the criteria
  • Produce clean, efficient code based on specifications
  • Integrate software components and third-party programs
  • Verify and deploy programs and systems
  • Troubleshoot, debug and upgrade existing software
  • Gather and evaluate user feedback
  • Recommend and execute improvements
  • Create technical documentation for reference and reporting

Requirements:
  • Proven experience as a Software Developer, Software Engineer or similar role
  • Familiarity with development methodologies
  • Experience with software design and development in a test-driven environment
  • Knowledge of coding languages (e.g. C#, C++) and frameworks/systems
  • Ability to learn new languages and technologies
  • Excellent communication skills
  • Resourcefulness and troubleshooting aptitude
  • Attention to detail
  • Sound technical knowledge; thorough knowledge of all related codes and section details is desired
  • Thorough knowledge of the design of components of residential/commercial structures is desired
  • Accuracy in following the process and jobs is required
  • Experience in interaction with international clients will be preferred
Posted by Sandeep Chaudhary
Location: Pune
Experience: 2 - 5 yrs
Salary: ₹1L - ₹18L / yr
Skills: Hadoop, Spark, Apache Hive, Apache Flume, Java, +5 more
Description:
  • Deep experience and understanding of Apache Hadoop and surrounding technologies required; experience with Spark, Impala, Hive, Flume, Parquet and MapReduce.
  • Strong understanding of development languages including Java, Python, Scala, and shell scripting.
  • Expertise in Apache Spark 2.x framework principles and usage.
  • Should be proficient in developing Spark batch and streaming jobs in Python, Scala or Java.
  • Proven experience in performance tuning of Spark applications, both from the application code and the configuration perspective.
  • Should be proficient in Kafka and its integration with Spark.
  • Should be proficient in Spark SQL and data warehousing techniques using Hive.
  • Very proficient in Unix shell scripting and in operating on Linux.
  • Should have knowledge of cloud-based infrastructure.
  • Good experience in tuning Spark applications and performance improvements.
  • Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities.
  • Experience with software development best practices: version control systems, automated builds, etc.
  • Experienced in, and able to lead, all phases of the Software Development Life Cycle on any project (feasibility planning, analysis, development, integration, test and implementation).
  • Capable of working within a team or as an individual.
  • Experience creating technical documentation.
Posted by Shekhar Singh kshatri
Location: Pune
Experience: 5 - 10 yrs
Salary: ₹5L - ₹5L / yr
Skills: Hadoop, Scala, Spark
We at InfoVision Labs are passionate about technology and what our clients would like to get accomplished. We continuously strive to understand business challenges, the changing competitive landscape, and how cutting-edge technology can help position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers, focused on mobile technology, responsive web solutions, and cloud-based solutions.

Job Responsibilities:
  • Minimum 3 years of experience in Big Data skills required
  • Complete life-cycle experience with Big Data is highly preferred
  • Skills: Hadoop, Spark, R, Hive, Pig, HBase and Scala
  • Excellent communication skills
  • Ability to work independently with no supervision
Posted by Kunal Kishore
Location: Pune
Experience: 2 - 7 yrs
Salary: ₹5L - ₹17L / yr
Skills: Test Driven Development (TDD), Continuous Integration tools, Git, RESTful APIs, Scala
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA, and New Zealand, leveraging the latest technology, marketing intelligence, and subject-matter expertise. With handpicked SMEs in a range of sciences and technology teams working on the latest ECM, Scala, SAP, and MS tech platforms, Crest not only develops quality STM content but continuously enhances the channels through which it is delivered to the world. Crest is ISO 9001 certified and driven by over 1000 professionals in Technology, Research & Analysis, and Marketing & BPM. Specialties: 1. Technology 2. Research 3. Marketing Intelligence 4. Business Process Management