39+ Scala Jobs in Hyderabad | Scala Job openings in Hyderabad
Apply to 39+ Scala Jobs in Hyderabad on CutShort.io. Explore the latest Scala Job opportunities across top companies like Google, Amazon & Adobe.
The Sr. Analytics Engineer provides technical expertise in needs identification, data modeling, data movement, transformation mapping (source to target), and automation and testing strategies, translating business needs into technical solutions while adhering to established data guidelines and approaches from a business-unit or project perspective.
Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.
Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.
Actively participates with other consultants in problem-solving and approach development.
Responsibilities:
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.
Perform data analysis to validate data models and to confirm the ability to meet business needs.
Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.
Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.
Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.
Coordinate with Data Architects, Program Managers and participate in recurring meetings.
Help and mentor team members to understand the data model and subject areas.
Ensure that the team adheres to best practices and guidelines.
Requirements:
- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.
- Experience with Spark optimization, tuning, and resource allocation.
- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.
- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL/analytical databases (e.g., Redshift, BigQuery, Cassandra).
- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob Storage, AWS S3, Google Cloud Storage, etc.
- Have a deep understanding of the various stacks and components of the Big Data ecosystem.
- Hands-on experience with Python is a huge plus
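Since several of the requirements above center on ETL pipeline design in Scala, here is a minimal, framework-free Scala sketch of the parse/filter/aggregate pattern such pipelines implement. The record shape, field names, and threshold are hypothetical, for illustration only:

```scala
// Minimal pure-Scala sketch of an ETL-style transform: parse, filter, aggregate.
// Record fields and thresholds are hypothetical.
case class Event(userId: String, amount: Double)

object EtlSketch {
  // Parse simple "userId,amount" lines, silently dropping malformed rows.
  def parse(lines: Seq[String]): Seq[Event] =
    lines.flatMap { line =>
      line.split(",") match {
        case Array(id, amt) => amt.toDoubleOption.map(Event(id, _))
        case _              => None
      }
    }

  // Total amount per user, keeping only totals above a threshold.
  def totalsAbove(events: Seq[Event], threshold: Double): Map[String, Double] =
    events
      .groupMapReduce(_.userId)(_.amount)(_ + _)
      .filter { case (_, total) => total > threshold }
}
```

The same shape (parse, then a keyed aggregation, then a filter) maps directly onto Spark's `flatMap`/`reduceByKey`/`filter` when run at scale.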
Publicis Sapient Overview:
As a Senior Associate in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will use a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will use a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role focuses on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies.
2. Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP).
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and the other components required to build end-to-end data pipelines.
4. Strong experience in at least one of Java, Scala, or Python; Java preferred.
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery.
6. Working knowledge of data-platform services on at least one cloud platform, including IAM and data security.
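The streaming components named in the mandatory list (Kafka, Spark Streaming, Flink, Storm) all revolve around windowed aggregation. Below is a minimal pure-Scala sketch of a tumbling-window count with no framework dependency; the epoch-second timestamps and window size are illustrative assumptions:

```scala
// Sketch of a tumbling-window event count, the core pattern behind
// Spark Streaming / Flink aggregations. Pure Scala, no framework required.
// Timestamps are epoch seconds; the window size is hypothetical.
object WindowSketch {
  def tumblingCounts(timestamps: Seq[Long], windowSec: Long): Map[Long, Int] =
    timestamps
      .groupBy(ts => (ts / windowSec) * windowSec) // key by window start
      .view.mapValues(_.size)
      .toMap
}
```

A real streaming engine adds watermarking and incremental state on top of this grouping, but the window-start keying is the same.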
Preferred Experience and Knowledge (Good to Have):
1. Good hands-on knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres).
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation.
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD – infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.
6. Cloud data specialty and other related Big Data technology certifications.
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Job Summary:
As Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will use a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role focuses on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies.
2. Minimum 1.5 years of experience in Big Data technologies.
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and the other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of Java, Scala, or Python; Java preferred.
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery.
Preferred Experience and Knowledge (Good to Have):
1. Good hands-on knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres).
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation.
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD – infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.
6. Working knowledge of data-platform services on at least one cloud platform, including IAM and data security.
7. Cloud data specialty and other related Big Data technology certifications.
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
Big Data Engineer – Scala
Required Skills:
1. Experience building modern, scalable REST-based microservices with Scala, preferably with Play as the MVC framework.
2. Expertise in functional programming with Scala.
3. Experience implementing RESTful web services in Scala, Java, or similar languages.
4. Experience with NoSQL and SQL databases.
5. Experience in information retrieval and machine learning.
6. Experience with or knowledge of big data using Scala, Spark, ML, Kafka, and Elasticsearch is a plus.
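Point 2 in the list above asks for functional programming in Scala. A small illustrative sketch of that style follows: immutable case classes, higher-order functions, and `Option` instead of null. The types and names are hypothetical, not from any specific codebase:

```scala
// Sketch of idiomatic functional Scala: immutable data, Option, no mutation.
// User shape and field names are hypothetical.
case class User(id: Int, email: Option[String])

object FpSketch {
  // Collect lowercase emails for users who have one; flatMap over Option
  // drops the Nones without any null checks.
  def emails(users: List[User]): List[String] =
    users.flatMap(_.email).map(_.toLowerCase)

  // Fold instead of a mutable accumulator: total length of all emails.
  def totalEmailLength(users: List[User]): Int =
    emails(users).foldLeft(0)(_ + _.length)
}
```

This is the style a Play-based service would use in its domain layer: pure transformations over immutable values, with effects pushed to the edges.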
Multinational company providing energy & automation digital solutions
Roles and Responsibilities
at Altimetrik
Big Data Engineer: 5+ yrs.
Immediate Joiner
- Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight
- Experience developing Lambda functions with AWS Lambda
- Expertise with Spark/PySpark – candidates should be hands-on with PySpark code and able to implement transformations with Spark
- Should be able to code in Python and Scala
- Snowflake experience is a plus
- Hadoop and Hive are good to have; a working understanding is sufficient rather than a hard requirement
at Altimetrik
Big Data with cloud:
Experience: 5-10 years
Location: Hyderabad/Chennai
Notice period: 15-20 days max
1. Expertise in building AWS data engineering pipelines with AWS Glue -> Athena -> QuickSight
2. Experience developing Lambda functions with AWS Lambda
3. Expertise with Spark/PySpark – candidates should be hands-on with PySpark code and able to implement transformations with Spark
4. Should be able to code in Python and Scala
5. Snowflake experience is a plus
Skill: Spark and Scala, along with Azure
Location: Pan India
Looking for someone with Big Data experience along with Azure
Consulting & implementation services in the Oil & Gas, Mining, and Manufacturing industries
- Data Engineer
Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python & PySpark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Data ingestion from different data sources that expose data using different technologies, such as RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems; implement data ingestion and processing with the help of Big Data technologies
- Data processing/transformation using technologies such as Spark and cloud services; you will need to understand your part of the business logic and implement it in the language supported by the base data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
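One of the core responsibilities above is developing automated data quality checks so that only valid data enters the platform. A minimal sketch of such a check, expressed as composable row validators, is shown below in Scala for concreteness (the role also lists Python/PySpark); the column names and rules are hypothetical:

```scala
// Hedged sketch of an automated data-quality check: composable validators
// that partition rows into valid rows and error messages.
// Row shape and rules are hypothetical.
case class Row(id: String, amount: Double)

object QualityCheck {
  // A check yields an error message when a row fails, None when it passes.
  type Check = Row => Option[String]

  val nonEmptyId: Check  = r => if (r.id.trim.isEmpty) Some("empty id") else None
  val nonNegative: Check = r => if (r.amount < 0) Some(s"negative amount for ${r.id}") else None

  // Run every check against every row; return (valid rows, all errors).
  def run(rows: Seq[Row], checks: Seq[Check]): (Seq[Row], Seq[String]) = {
    val results = rows.map(r => (r, checks.flatMap(c => c(r))))
    (results.collect { case (r, errs) if errs.isEmpty => r }, results.flatMap(_._2))
  }
}
```

Keeping each rule as a standalone function makes the rule set easy to extend and unit-test before wiring it into an ingestion job.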
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with
- Data mining/programming tools (e.g. SAS, SQL, R, Python)
- Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
- Data visualization (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience in the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
We are hiring for a Tier 1 MNC for a software developer role requiring good knowledge of Spark, Hadoop, and Scala.
This company provides on-demand cloud computing platforms.
- 15+ years of Hands-on technical application architecture experience and Application build/ modernization experience
- 15+ years of experience as a technical specialist in Customer-facing roles.
- Ability to travel to client locations as needed (25-50%)
- Extensive experience architecting, designing and programming applications in an AWS Cloud environment
- Experience with designing and building applications using AWS services such as EC2, AWS Elastic Beanstalk, AWS OpsWorks
- Experience architecting highly available systems that utilize load balancing, horizontal scalability and high availability
- Hands-on programming skills in any of the following: Python, Java, Node.js, Ruby, .NET or Scala
- Agile software development expert
- Experience with continuous integration tools (e.g. Jenkins)
- Hands-on familiarity with CloudFormation
- Experience with configuration management platforms (e.g. Chef, Puppet, Salt, or Ansible)
- Strong scripting skills (e.g. Powershell, Python, Bash, Ruby, Perl, etc.)
- Strong practical application development experience on Linux and Windows-based systems
- Extracurricular software development passion (e.g. active open-source contributor)
• Experience with Advanced SQL
• Experience with Azure Data Factory, Databricks
• Experience with Azure IoT, Cosmos DB, Blob Storage
• API management, FHIR API development
• Proficient with Git and CI/CD best practices
• Experience working with Snowflake is a plus
We have an urgent requirement for a Data Engineer / Sr. Data Engineer at a reputed MNC.
Exp: 4-9yrs
Location: Pune/Bangalore/Hyderabad
Skills: we need candidates with either Python + AWS, PySpark + AWS, or Spark + Scala.
at Persistent Systems
We have an urgent requirement for Big Data Developer profiles at our reputed MNC.
Location: Pune/Bangalore/Hyderabad/Nagpur
Experience: 4-9yrs
Skills: PySpark + AWS, or Spark + Scala + AWS, or Python + AWS
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.
Top Skills
- You write high quality, maintainable, and robust code, often in Java, C++, C, Python, Ruby on Rails, or C#
- You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
- You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities
- You solve problems at their root, stepping back to understand the broader context.
- You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
- You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
- You recognize and use design patterns to solve business problems.
- You understand how operating systems work, perform and scale.
- You continually align your work with Amazon’s business objectives and seek to deliver business value.
- You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
- You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
- You communicate clearly with your team and with other groups and listen effectively.
Skills & Experience
- Bachelors or Masters in Computer Science or relevant technical field.
- Experience in software development and full product life-cycle.
- Excellent programming skills in any object-oriented programming language - preferably Java, C/C++/C#, Perl, Python, or Ruby.
- Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
- Proficiency in SQL and data modeling.
Role: Software Development Engineer 2
About Amazon.com
“Many of the problems we face have no textbook solution, and so we-happily-invent new ones.” – Jeff Bezos
Amazon.com – a place where builders can build. We hire the world's brightest minds and offer them an environment in which they can invent and innovate to improve the experience for our customers. A Fortune 100 company based in Seattle, Washington, Amazon is the global leader in e-commerce. Amazon offers everything from books and electronics to apparel and diamond jewelry. We operate sites in Australia, Brazil, Canada, China, France, Germany, India, Italy, Japan, Mexico, Netherlands, Spain, United Kingdom and United States, and maintain dozens of fulfillment centers around the world which encompass more than 26 million square feet.
Technological innovation drives the growth of Amazon, offering our customers more selection, convenient shopping, and low prices. Amazon Web Services provides developers and small to large businesses access to the horizontally scalable state of the art cloud infrastructure like S3, EC2, AMI, CloudFront and SimpleDB, that powers Amazon.com. Developers can build any type of business on Amazon Web Services and scale their application with growing business needs.
We want you to help share and shape our mission to be Earth's most customer-centric company. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of invention that is part of our DNA. We do this every day by inventing elegant and simple solutions to complex technical and business problems. We're making history and the good news is that we've only just begun.
About Amazon India
Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.
We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and hundreds of categories! Our philosophy of working backwards from the customer is what drives our growth and success.
We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India.
Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.
Software Development Engineer – SDE 2.
Please find the JD below and reply with an updated resume if you are interested.
Software Development Engineer
Bengaluru / Hyderabad / Chennai / Delhi
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.
Top Skills
• You write high quality, maintainable, and robust code, often in Java or C++.
• You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
• You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities
• You solve problems at their root, stepping back to understand the broader context.
• You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
• You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
• You recognize and use design patterns to solve business problems.
• You understand how operating systems work, perform and scale.
• You continually align your work with Amazon’s business objectives and seek to deliver business value.
• You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
• You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
• You communicate clearly with your team and with other groups and listen effectively.
Skills & Experience
• Bachelor's or Master's degree in Computer Science or a relevant technical field.
• Experience in software development and full product life-cycle.
• Excellent programming skills in an object-oriented programming language - preferably Java, C/C++/C#, Perl, Python, or Ruby.
• Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
• Proficiency in SQL and data modeling.
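The "recognize and use design patterns" point above can be made concrete with a minimal sketch. This is an illustrative example only (the names `Order`, `checkout`, and the pricing rules are hypothetical, not part of any Amazon system): the classic Strategy pattern, which in Scala collapses to passing a function value.

```scala
// Strategy pattern, Scala-style: each pricing strategy is a plain function
// value, so swapping business rules needs no class hierarchy.
case class Order(subtotal: BigDecimal)

// Two interchangeable strategies of type Order => BigDecimal.
val standardPricing: Order => BigDecimal = o => o.subtotal
val festivalPricing: Order => BigDecimal = o => o.subtotal * BigDecimal("0.9")

// The caller picks the strategy at runtime.
def checkout(order: Order, pricing: Order => BigDecimal): BigDecimal =
  pricing(order)
```

For example, `checkout(Order(BigDecimal(100)), festivalPricing)` applies the 10% discount and yields 90, while the standard strategy leaves the subtotal untouched.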
About Amazon.com
“Many of the problems we face have no textbook solution, and so we happily invent new ones.” – Jeff Bezos
Amazon.com – a place where builders can build. We hire the world's brightest minds and offer them an environment in which they can invent and innovate to improve the experience for our customers. A Fortune 100 company based in Seattle, Washington, Amazon is the global leader in e-commerce. Amazon offers everything from books and electronics to apparel and diamond jewelry. We operate sites in Australia, Brazil, Canada, China, France, Germany, India, Italy, Japan, Mexico, Netherlands, Spain, United Kingdom and United States, and maintain dozens of fulfillment centers around the world which encompass more than 26 million square feet.
Technological innovation drives the growth of Amazon, offering our customers more selection, convenient shopping, and low prices. Amazon Web Services provides developers and small to large businesses access to the horizontally scalable, state-of-the-art cloud infrastructure (S3, EC2, AMI, CloudFront, and SimpleDB) that powers Amazon.com. Developers can build any type of business on Amazon Web Services and scale their applications with growing business needs.
We want you to help share and shape our mission to be Earth's most customer-centric company. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of invention that is part of our DNA. We do this every day by inventing elegant and simple solutions to complex technical and business problems. We're making history and the good news is that we've only just begun.
About Amazon India
Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.
We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.
We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India.
Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.
Thanks and Regards,
Archana J
Recruiter (Tech) | Consumer TA
Role: Software Development Engineer-2
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.
Top Skills
You write high quality, maintainable, and robust code, often in Java, C++, or C#.
You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities
You solve problems at their root, stepping back to understand the broader context.
You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
You recognize and use design patterns to solve business problems.
You understand how operating systems work, perform and scale.
You continually align your work with Amazon’s business objectives and seek to deliver business value.
You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
You communicate clearly with your team and with other groups and listen effectively.
Skills & Experience
Bachelor's or Master's degree in Computer Science or a relevant technical field.
Experience in software development and full product life-cycle.
Excellent programming skills in an object-oriented programming language - preferably Java, C/C++/C#, Perl, Python, or Ruby.
Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
Proficiency in SQL and data modeling.
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working knowledge of SQL and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
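The extraction-transformation-loading responsibilities above can be sketched in miniature. This is a hedged, in-memory illustration only (the names `RawEvent`, `FactRow`, and `transform` are hypothetical); a production pipeline would run the same shape of logic on Spark or Kafka against a warehouse such as Redshift.

```scala
// ETL sketch: extract raw events, filter invalid records, aggregate per user
// (a simple source-to-target mapping), producing rows ready to load.
case class RawEvent(userId: String, amountCents: Long, valid: Boolean)
case class FactRow(userId: String, totalCents: Long)

def transform(events: Seq[RawEvent]): Seq[FactRow] =
  events
    .filter(_.valid)                 // data quality gate
    .groupBy(_.userId)               // aggregate key
    .map { case (u, es) => FactRow(u, es.map(_.amountCents).sum) }
    .toSeq
    .sortBy(_.userId)                // deterministic load order
```

Given two valid events for user "a" (100 and 50 cents) and one invalid event, this yields a single `FactRow("a", 150)`.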
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Greetings from Amazon...!
It is our pleasure to personally invite you to apply for a job with Amazon Development Centre India (ADCI). At Amazon we hire people with a passion for technology, and you happen to be one of the shortlisted candidates. Our business is committed to recognizing potential and creating teams that embrace innovation.
Please find the eligibility criteria and requirements below:
Job title : SDE – II (Software Development Engineer)
Role Opportunity : Permanent/Full Time/FTE/Regular
Work Location : Hyderabad / Bangalore / Gurgaon
Must Have
- Strong Exposure to Data Structures, Algorithms, Coding, System Design (LLD, HLD, OOAD), Distributed systems, problem solving skills, Architecture (MVC/Microservices), logical thinking.
Amazon (ADCI) - If you are looking for an opportunity to solve deep technical problems and build innovative solutions in a fast-paced environment working with smart, passionate software developers, this might be the role for you. Amazon’s transportation systems get millions of packages to customers worldwide faster and cheaper while providing a world-class customer experience – from checkout to shipment tracking to delivery. Our software systems include services that handle thousands of requests per second, make business decisions impacting billions of dollars a year, integrate with a network of small and large carriers worldwide, manage business rules for millions of unique products, and improve the experience for millions of online shoppers. With rapid expansion into new geographies, innovations in supply chain, delivery models and customer experience, an increasingly complex transportation network, an ever-expanding selection of products and a growing number of shipments worldwide, we have an opportunity to build software that scales the business, leads the industry through innovation and delights millions of customers worldwide.
As an SDE, you will develop a deep understanding of our business, work closely with development teams and own the architecture and end-to-end delivery of software components.
About Amazon India:
Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.
We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.
We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India. Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.
Basic Qualifications:
- 3+ years’ experience building successful production software systems
- A solid grounding in Computer Science fundamentals (based on a BS or MS in CS or related field)
- The ability to convert raw requirements into good design while exploring technical feasibility tradeoffs
- Expertise in system design (design patterns, LLD, HLD, SOLID principles, OOAD, distributed systems, etc.) and architecture (MVC/microservices).
- Good understanding of at least one modern programming language (e.g., Java) and open-source technologies (C++, Python, Scala, C#, PHP, Ruby, etc.)
- Excellence in technical communication
- Has experience in mentoring other software developers
Preferred Qualifications:
- BS/MS in Computer Science or equivalent
- Experience developing service oriented architectures and an understanding of design for scalability, performance and reliability
- Demonstrated ability to mentor other software developers to maintain architectural vision and software quality
- Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment
- Expertise in delivering high-quality, innovative application
- Strong desire to build, sense of ownership, urgency, and drive
- Strong organizational and problem solving skills with great attention to detail
- Ability to triage issues, react well to change, work with teams, and multi-task across multiple products and projects.
- Experience building highly scalable, high availability services
- The ideal candidate will be a visionary leader, builder and operator.
- He/she should have experience leading or contributing to multiple simultaneous product development efforts and initiatives.
- He/she needs to balance technical leadership with strong business judgment to make the right decisions about technology choices.
- He/she needs to be constantly striving for simplicity while demonstrating significant creativity, innovation and judgment.
- Proficiency in, at least, one modern programming language.
- Experience with SQL or NoSQL databases.
- Strong sense of ownership, urgency, and drive.
- Demonstrated leadership abilities in an engineering environment in driving operational excellence and best practices.
- Demonstrated ability to achieve stretch goals in a highly innovative and fast paced environment.
- Excellent communication, collaboration, reporting, analytical and problem solving skills.
Good to Have:
- Knowledge of professional software engineering practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Experience with enterprise-wide systems
- Experience influencing software engineering best practices within your team
- Hands-on expertise in many disparate technologies, typically ranging from front-end user interfaces through to back-end systems and all points in between
- Strong written and verbal communication skills preferred
Key Points to remember:
- Strong knowledge of the Software Development Life Cycle methodology
- Technical design, development and implementation decisions on the use of technology in area(s) of specialization.
- Write or modify programming code to suit customer's needs.
- Unit test to assure meets requirements, including integration test as needed.
- Ability to understand and analyze issues and use judgment to make decisions.
- Strong problem solving & troubleshooting skills
- Strong communication skills
- Responsible for self-development according to professional development plan
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 7+ years
Location : Pune / Hyderabad
Skills :
- Drive and participate in requirements gathering workshops, estimation discussions, design meetings and status review meetings
- Participate and contribute in Solution Design and Solution Architecture for implementing Big Data Projects on-premise and on cloud
- Technical Hands on experience in design, coding, development and managing Large Hadoop implementation
- Proficient in SQL, Hive, Pig, Spark SQL, shell scripting, Kafka, Flume, and Sqoop on large Big Data and Data Warehousing projects, with a Java-, Python- or Scala-based Hadoop programming background
- Proficient with various development methodologies like waterfall, agile/scrum and iterative
- Good interpersonal skills and excellent communication skills for US- and UK-based clients
About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their Data/Workload/ETL/Analytics to the Cloud by leveraging Automation.
We have expertise in transforming legacy Teradata, Oracle, Hadoop, Netezza, Vertica, Greenplum along with ETLs like Informatica, Datastage, AbInitio & others, to cloud-based data warehousing with other capabilities in data engineering, advanced analytics solutions, data management, data lake and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
at Persistent Systems
Location: Pune / Nagpur / Goa / Hyderabad
Job Requirements:
- 9+ years of total experience, preferably in the big data space.
- Creating Spark applications using Scala to process data.
- Experience in scheduling and troubleshooting/debugging Spark jobs in steps.
- Experience in Spark job performance tuning and optimization.
- Experience processing data using Kafka/Python.
- Experience configuring Kafka topics to optimize performance.
- Proficient in writing SQL queries to process data in a Data Warehouse.
- Hands-on experience with Linux commands to troubleshoot/debug issues, and with creating shell scripts to automate tasks.
- Experience with AWS services such as EMR.
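The "Spark applications using Scala" requirement above can be illustrated without a cluster, since Scala collections share the same transformation vocabulary. This is a hedged sketch: on real Spark the same shape would be `rdd.flatMap(...).map((_, 1)).reduceByKey(_ + _)`; here plain collections stand in so the example is self-contained.

```scala
// The classic word count: tokenize, group by key, reduce per key -
// the same map/shuffle/reduce shape a Spark job in Scala would use.
def wordCount(lines: Seq[String]): Map[String, Int] =
  lines
    .flatMap(_.toLowerCase.split("\\s+")) // tokenize each line
    .filter(_.nonEmpty)                   // drop empty tokens
    .groupBy(identity)                    // shuffle-like grouping by word
    .map { case (w, ws) => w -> ws.size } // reduce: count per word
```

For example, `wordCount(Seq("spark spark scala"))` yields `Map("spark" -> 2, "scala" -> 1)`.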
Experience: 3-5 years
Domain: SQL Server / SSIS / Cloud technology
Good knowledge of creating new tables in a database.
Using triggers, truncate, delete, views, etc.
We are looking for an intern with experience/knowledge in software design, coding and debugging for iOS and Android.
Positions open: 04
iOS Intern: 02 Android Intern: 02
Work Location: Hyderabad
Responsibilities
- Document and test new software applications.
- Research and assess new application ideas.
- Develop applications (coding, programming).
Requirements
- Any Graduate with good knowledge of C, C++, or Java.
- Excellent analytical and logical skills.
- Ability to work in teams.
Benefits
- Practical experience with a wide variety of software engineering tasks.
- Collaborating hand-in-hand with the tech team and interacting directly with the team leads and CTO.
- Shadowing, mentoring, and training opportunities with seasoned professionals.
- Opportunity to participate in networking events and company meetings.
at Turvo
JD:
Your role will include:
- Writing and testing your code, innovating and contributing towards increasing the value delivered by your team.
- Setting a high bar through your design, development, analysis and deployment activities
- Understanding and participating in evolving the architecture of our products.
- Keeping up-to-date with new technologies, best practices, and work on optimizing the tooling and automation.
- Understanding the latest development and engineering paradigms like Scrum/Agile/TDD/BDD/DDD etc.
You have experience with the following:
- Strong experience leading and being part of technical teams, preferably following agile methodology.
- Strong technical background with ability to provide technical guidance to other team members.
- Knowledge of microservices, with hands-on experience implementing at least a few microservices.
- Knowledge of API-driven platform development and software integration.
- You have hands-on experience in building secure, high-performing and scalable systems in Java.
- Exposure to JVM based languages like Java, Scala, Clojure.
- 5+ years of experience in a Data Engineer role
- Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases such as Cassandra.
- Experience with AWS cloud services: EC2, EMR, Athena
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Advanced SQL knowledge and experience working with relational databases, query authoring (SQL) as well as familiarity with unstructured datasets.
- Deep problem-solving skills to perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
at Qvantel Software Solutions Ltd
Requirements:
- Academic degree (BE / MCA) with 3-10 years of experience in back-end development.
- Strong knowledge of OOP concepts, analysis, design, development, and unit testing
- Scala technologies: Akka, REST web services, SOAP, Jackson JSON API, JUnit, Mockito, Maven
- Hands-on experience with Play framework
- Familiarity with Microservice Architecture
- Experience working with Apache Tomcat server or TomEE
- Experience working with SQL databases (MySQL, PostgreSQL, Cassandra), writing custom queries, procedures, and designing schemas.
- Good to have front end experience (JavaScript, Angular JS/ React JS)
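The REST and Play-framework skills listed above can be sketched with pattern matching, a core Scala idiom for routing. This is a dependency-free illustration only: real Play code uses its routes DSL and `Action` builders, and the `Response`, `route`, and path names here are hypothetical.

```scala
// Minimal REST-style routing via pattern matching on method and path segments.
sealed trait Response
case class Ok(body: String)       extends Response
case class NotFound(path: String) extends Response

def route(method: String, path: String): Response =
  (method, path.split("/").filter(_.nonEmpty).toList) match {
    case ("GET", List("users", id)) => Ok(s"""{"id":"$id"}""")  // GET /users/:id
    case _                          => NotFound(path)           // fallback
  }
```

For example, `route("GET", "/users/42")` matches the first case and returns a small JSON body, while any other request falls through to `NotFound`.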
Benefits
Support for continuous learning
Competitive salary
Quarterly webinars and annual conferences