Scala Jobs in Chennai

Apply to 28+ Scala Jobs in Chennai on CutShort.io. Explore the latest Scala Job opportunities across top companies like Google, Amazon & Adobe.

Delivery Solutions
Chennai
5 - 13 yrs
₹12L - ₹28L / yr
AngularJS (1.x)
Angular (2+)
React.js
NodeJS (Node.js)
Python
+8 more

About UPS:


Moving our world forward by delivering what matters! UPS is a company with a proud past and an even brighter future. Our values define us. Our culture differentiates us. Our strategy drives us. At UPS we are customer-first, people-led and innovation-driven. UPS’s India-based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and help us engineer technology solutions that drastically improve our competitive advantage in the field of logistics.

‘Future You’ grows as a visible and valued technology professional with UPS, driving us towards an exciting tomorrow. As a global technology organization, we can put serious resources behind your development. If you are solutions-oriented, UPS Technology is the place for you. ‘Future You’ delivers ground-breaking solutions to some of the biggest logistics challenges around the globe. You’ll take technology to unimaginable places and really make a difference for UPS and our customers.


Job Summary:

This position participates in the support of batch and real-time data pipelines utilizing various data analytics processing frameworks, in support of data science practices for the Marketing and Finance business units. This position supports the integration of data from various data sources, performs extract, transform, load (ETL) data conversions, and facilitates data cleansing and enrichment. This position performs full systems life cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and applications software. This position also participates in and contributes to synthesizing disparate data sources to support reusable and reproducible data assets.


RESPONSIBILITIES

• Supervises and supports data engineering projects and builds solutions by leveraging a strong foundational knowledge in software/application development. This is a hands-on role.

• Develops and delivers data engineering documentation.

• Gathers requirements, defines the scope, and performs the integration of data for data engineering projects.

• Recommends analytic reporting products/tools and supports the adoption of emerging technology.

• Performs data engineering maintenance and support.

• Provides the implementation strategy and executes backup, recovery, and technology solutions to perform analysis.

• Uses ETL tools to pull data from various sources and load the transformed data into a database or business intelligence platform.

• Codes in programming languages used for statistical analysis and modeling, such as Python and Spark (see the sketch below).
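
To make the ETL responsibility above concrete, here is a minimal, hypothetical sketch. The posting names Python and Spark; the DataFrame API is essentially the same in Scala (the focus of this page), and the file path, column names and table name below are invented for illustration rather than taken from any UPS system.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object MarketingEtlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("marketing-etl-sketch")
      .getOrCreate()

    // Extract: read raw campaign events (hypothetical path and schema).
    val raw = spark.read
      .option("header", "true")
      .csv("/data/raw/campaign_events.csv")

    // Transform: basic cleansing and enrichment.
    val cleaned = raw
      .filter(col("customer_id").isNotNull)
      .withColumn("event_date", to_date(col("event_ts")))
      .dropDuplicates("event_id")

    // Load: write to a managed table for downstream analytics (database name is illustrative).
    cleaned.write.mode("overwrite").saveAsTable("analytics.campaign_events_clean")

    spark.stop()
  }
}
```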


REQUIRED QUALIFICATIONS:

• Literate in the programming languages used for statistical modeling and analysis, data warehousing and Cloud solutions, and building data pipelines.

• Proficient in developing notebooks in Databricks using Python, Spark and Spark SQL.

• Strong understanding of a cloud services platform (e.g., GCP, Azure or AWS) and all the data life cycle stages; Azure is preferred.

• Proficient in using Azure Data Factory and other Azure features such as Logic Apps.

• Knowledge of Delta Lake, Lakehouse and Unity Catalog concepts is preferred.

• Strong understanding of cloud-based data lake systems and data warehousing solutions.

• Has used Agile concepts for development, including Kanban and Scrum.

• Strong understanding of the data interconnections between organizations’ operational and business functions.

• Strong understanding of the data life cycle stages - data collection, transformation, analysis, storing the data securely, providing data accessibility

• Strong understanding of the data environment to ensure that it can scale for the following demands: data throughput, increasing data pipeline throughput, analyzing large amounts of data, real-time predictions, insights and customer feedback, data security, data regulations, and compliance.

• Strong knowledge of algorithms and data structures, as well as data filtering and data optimization.

• Strong understanding of analytic reporting technologies and environments (e.g., Power BI, Looker, Qlik, etc.)

• Understanding of distributed systems and the underlying business problem being addressed; guides team members on how their work will contribute by performing data analysis and presenting findings to the stakeholders.

REQUIRED SKILLS:

 3 years of experience with Databricks, Apache Spark, Python, and SQL


PREFERRED SKILLS:

• Delta Lake, Unity Catalog, R, Scala, Azure Logic Apps, a cloud services platform (e.g., GCP, Azure or AWS), and Agile concepts.

Sahaj AI Software
Posted by Priya R
Chennai, Bengaluru (Bangalore), Pune
8 - 14 yrs
Best in industry
Data engineering
Python
Scala
Databricks
Apache Spark
+3 more

About Us

Sahaj Software is an artisanal software engineering firm built on the values of trust, respect, curiosity, and craftsmanship, delivering purpose-built solutions that drive data-led transformation for organisations. Our emphasis is on craft: we create purpose-built solutions, leveraging Data Engineering, Platform Engineering and Data Science with a razor-sharp focus to solve complex business and technology challenges and give customers a competitive edge.


About The Role

As a Data Engineer, you’ll feel at home if you are hands-on, grounded, opinionated and passionate about delivering comprehensive data solutions that align with modern data architecture approaches. Your work will range from building a full data platform to building data pipelines or helping with data architecture and strategy. This role is ideal for those looking to have a large impact and huge scope for growth, while still being hands-on with technology. We aim to allow growth without becoming “post-technical”.


Responsibilities

  • Collaborate with Data Scientists and Engineers to deliver production-quality AI and Machine Learning systems
  • Build frameworks and supporting tooling for data ingestion from a complex variety of sources (see the sketch after this list)
  • Consult with our clients on data strategy, modernising their data infrastructure, architecture and technology
  • Model their data for increased visibility and performance
  • You will be given ownership of your work, and are encouraged to propose alternatives and make a case for doing things differently; our clients trust us and we manage ourselves.
  • You will work in short sprints to deliver working software
  • You will be working with other data engineers in Sahaj and work on building Data Engineering capability across the organisation
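
Below is a hedged illustration of the ingestion tooling mentioned in the responsibilities above: a tiny source abstraction over Spark readers. The trait, source types, connection details and table names are invented for the example and are not Sahaj's actual framework.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

// A minimal abstraction over heterogeneous ingestion sources.
trait Source {
  def read(spark: SparkSession): DataFrame
}

final case class CsvSource(path: String) extends Source {
  def read(spark: SparkSession): DataFrame =
    spark.read.option("header", "true").csv(path)
}

final case class JdbcSource(url: String, table: String) extends Source {
  def read(spark: SparkSession): DataFrame =
    spark.read.format("jdbc").option("url", url).option("dbtable", table).load()
}

object IngestSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("ingest-sketch").getOrCreate()

    // Hypothetical source registry; a real framework would load this from configuration.
    val sources: Seq[Source] = Seq(
      CsvSource("/landing/orders.csv"),
      JdbcSource("jdbc:postgresql://db:5432/shop", "public.customers")
    )

    // Land each source as a raw table for downstream pipelines (database name is illustrative).
    sources.zipWithIndex.foreach { case (src, i) =>
      src.read(spark).write.mode("overwrite").saveAsTable(s"raw.source_$i")
    }
    spark.stop()
  }
}
```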


You can read more about what we do and how we think here: https://sahaj.ai/client-stories/


Skills you’ll need

  • Demonstrated experience as a Senior Data Engineer in complex enterprise environments
  • Deep understanding of technology fundamentals and experience with languages like Python, or functional programming languages like Scala
  • Demonstrated experience in the design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake
  • Commendable skills in building data products by integrating large data sets from hundreds of internal and external sources are highly critical
  • A nuanced understanding of code quality, maintainability and practices like Test Driven Development
  • Ability to deliver an application end to end; having an opinion on how your code should be built, packaged and deployed using CI/CD
  • Understanding of Cloud platforms, DevOps, GitOps, and Containers


What will you experience as a culture at Sahaj?

At Sahaj, the people's collective stands for a shared purpose where everyone owns the dreams, ideas, ideologies, successes, and failures of the organisation - a synergy that is rooted in the ethos of honesty, respect, trust, and equitability. At Sahaj, you will experience:

  • Creativity
  • Ownership
  • Curiosity
  • Craftsmanship
  • A culture of trust, respect and transparency
  • Opportunity to collaborate with some of the finest minds in the industry
  • Work across multiple domains


What are the benefits of being at Sahaj?

  •  Unlimited leaves
  •  Life Insurance & Private Health insurance paid by Sahaj
  • Stock options
  • No hierarchy
  • Open Salaries 


Sahaj AI Software
Posted by Soumya Tripathy
Bengaluru (Bangalore), Chennai
11 - 17 yrs
Best in industry
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+10 more

About the role

As a full-stack engineer, you’ll feel at home if you are hands-on, grounded, opinionated and passionate about building things using technology. Our tech stack ranges widely, with language ecosystems like TypeScript, Java, Scala, Golang, Kotlin, Elixir, Python, .NET, Node.js and even Rust.

This role is ideal for those looking to have a large impact and a huge scope for growth while still being hands-on with technology. We aim to allow growth without becoming “post-technical”. We are extremely selective with our consultants and are able to run our teams with fewer levels of management. You won’t find a BA or iteration manager here! We work in small pizza teams of 2-5 people, where a well-founded argument holds more weight than years of experience. You will have the opportunity to work with clients across domains like retail, banking, publishing, education, ad tech and more, where you will take ownership of developing software solutions that are purpose-built to solve our clients’ unique business and technical needs.

Responsibilities

  • Produce high-quality code that allows us to put solutions into production.
  • Utilize DevOps tools and practices to build and deploy software.
  • Collaborate with Data Scientists and Engineers to deliver production-quality AI and Machine Learning systems.
  • Build frameworks and supporting tooling for data ingestion from a complex variety of sources.
  • Work in short sprints to deliver working software with clear deliverables and client-led deadlines.
  • Willingness to be a polyglot developer and learn multiple technologies.

Skills you’ll need

  • A maker’s mindset. To be resourceful and have the ability to do things that have no instructions.
  • Extensive experience (at least 10 years) as a Software Engineer.
  • Deep understanding of programming fundamentals and expertise with at least one programming language (functional or object-oriented).
  • A nuanced and rich understanding of code quality, maintainability and practices like Test Driven Development.
  • Experience with one or more source control and build toolchains.
  • Working knowledge of CI/CD will be an added advantage.
  • Understanding of web APIs, contracts and communication protocols.
  • Understanding of Cloud platforms, infra-automation/DevOps, IaC/GitOps/Containers, design and development of large data platforms.

What will you experience in terms of culture at Sahaj?

  • A culture of trust, respect and transparency
  • Opportunity to collaborate with some of the finest minds in the industry
  • Work across multiple domains

What are the benefits of being at Sahaj?

  • Unlimited leaves
  • Life Insurance & Private Health insurance paid by Sahaj
  • Stock options
  • No hierarchy
  • Open Salaries

Indian Based IT Service Organization

Agency job
via People First Consultants by Aishwarya KA
Chennai, Tirunelveli
5 - 7 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+7 more

Greetings!!!!


We are looking for a data engineer for one of our premium clients, for their Chennai and Tirunelveli locations.


Required Education/Experience


● Bachelor’s degree in Computer Science or a related field

● 5-7 years’ experience in the following:

● Snowflake and Databricks management

● Python and AWS Lambda

● Scala and/or Java

● Data integration services, SQL and Extract, Transform, Load (ETL)

● Azure or AWS for development and deployment

● Jira or similar tool during SDLC

● Experience managing a codebase using a code repository such as Git/GitHub or Bitbucket

● Experience working with a data warehouse.

● Familiarity with structured and semi-structured data formats including JSON, Avro, ORC, Parquet, or XML

● Exposure to working in an agile work environment


Chennai, Hyderabad
5 - 10 yrs
₹10L - ₹25L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more

Big Data with Cloud:

 

Experience: 5-10 years

Location: Hyderabad/Chennai

Notice period: 15-20 days max

 

1.  Expertise in building AWS Data Engineering pipelines with AWS Glue -> Athena -> QuickSight

2.  Experience in developing lambda functions with AWS Lambda

3.  Expertise with Spark/PySpark – candidates should be hands-on with PySpark code and able to do transformations with Spark (see the sketch after this list)

4.  Should be able to code in Python and Scala.

5.  Snowflake experience will be a plus
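
A hedged example of the transformation work item 3 refers to. The posting emphasises PySpark; the DataFrame API is equivalent in Scala (the focus of this page), and the S3 paths, columns and aggregation below are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object TransformSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("transform-sketch").getOrCreate()

    // Hypothetical orders dataset, e.g. landed by an upstream AWS Glue job.
    val orders = spark.read.parquet("s3://example-bucket/curated/orders/")

    // Typical transformations: filter bad rows, derive a column, aggregate.
    val dailyRevenue = orders
      .filter(col("status") === "COMPLETED")
      .withColumn("order_date", to_date(col("order_ts")))
      .groupBy("order_date")
      .agg(sum("amount").as("revenue"), countDistinct("customer_id").as("customers"))

    dailyRevenue.write.mode("overwrite").parquet("s3://example-bucket/marts/daily_revenue/")
    spark.stop()
  }
}
```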

HCL Technologies
Agency job
via Saiva System by Sunny Kumar
Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Bengaluru (Bangalore), Hyderabad, Chennai, Pune, Mumbai, Kolkata
5 - 10 yrs
₹5L - ₹20L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+2 more
Experience: 5+ years
Skills: Spark and Scala, along with Azure
Location: Pan India

Looking for someone with Big Data and Azure experience.
GradMener Technology Pvt. Ltd.
Pune, Chennai
5 - 9 yrs
₹15L - ₹20L / yr
Scala
PySpark
Spark
SQL Azure
Hadoop
+4 more
  • 5+ years of experience in a Data Engineering role in a cloud environment
  • Must have good experience in Scala/PySpark (preferably in a Databricks environment)
  • Extensive experience with Transact-SQL.
  • Experience in Databricks/Spark.
  • Strong experience in data warehouse projects
  • Expertise in database development projects with ETL processes.
  • Manage and maintain data engineering pipelines
  • Develop batch processing, streaming and integration solutions (see the sketch after this list)
  • Experienced in building and operationalizing large-scale enterprise data solutions and applications
  • Using one or more of Azure data and analytics services in combination with custom solutions
  • Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
  • In-depth understanding of data management (e.g. permissions, security, and monitoring).
  • Cloud repositories, e.g. Azure GitHub, Git
  • Experience in an agile environment (Prefer Azure DevOps).
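
As a hedged sketch of the batch/streaming work mentioned above, the snippet below uses Spark Structured Streaming to watch a folder of JSON files and maintain a per-minute aggregate. The schema, paths and window are invented for the example.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object StreamingSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("streaming-sketch").getOrCreate()

    // Hypothetical landing folder that an upstream process drops JSON files into.
    val events = spark.readStream
      .schema("device_id STRING, reading DOUBLE, event_ts TIMESTAMP")
      .json("/landing/sensor_events/")

    // Windowed aggregation, a common streaming-integration pattern.
    val perMinute = events
      .withWatermark("event_ts", "5 minutes")
      .groupBy(window(col("event_ts"), "1 minute"), col("device_id"))
      .agg(avg("reading").as("avg_reading"))

    val query = perMinute.writeStream
      .outputMode("update")
      .format("console")
      .option("checkpointLocation", "/tmp/checkpoints/streaming-sketch")
      .start()

    query.awaitTermination()
  }
}
```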

Good to have

  • Manage source data access security
  • Automate Azure Data Factory pipelines
  • Continuous Integration/Continuous Deployment (CI/CD) pipelines, source repositories
  • Experience in implementing and maintaining CI/CD pipelines
  • Power BI understanding, Delta Lakehouse architecture
  • Knowledge of software development best practices.
  • Excellent analytical and organization skills.
  • Effective working in a team as well as working independently.
  • Strong written and verbal communication skills.
  • Expertise in database development projects and ETL processes.

Tier 1 MNC

Agency job
Chennai, Pune, Bengaluru (Bangalore), Noida, Gurugram, Kochi (Cochin), Coimbatore, Hyderabad, Mumbai, Navi Mumbai
3 - 12 yrs
₹3L - ₹15L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+1 more
Greetings,
We are hiring for a Tier 1 MNC for a software developer role requiring good knowledge of Spark, Hadoop and Scala.

Sopra Steria

Agency job
via Mount Talent Consulting by Himani Jain
Chennai, Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 8 yrs
₹2L - ₹12L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+1 more
Good hands-on experience with Spark and Scala.
Should have experience in Big Data and Hadoop.
Currently providing WFH.
Immediate joiners or those with up to 30 days' notice.

Amagi Media Labs
Posted by Rajesh C
Bengaluru (Bangalore), Chennai
12 - 15 yrs
₹50L - ₹60L / yr
Data Science
Machine Learning (ML)
ETL
Data Warehouse (DWH)
Amazon Web Services (AWS)
+5 more
Job Title: Data Architect
Job Location: Chennai

Job Summary
The Engineering team is seeking a Data Architect. As a Data Architect, you will drive a
Data Architecture strategy across various Data Lake platforms. You will help develop
reference architecture and roadmaps to build highly available, scalable and distributed
data platforms using cloud based solutions to process high volume, high velocity and
wide variety of structured and unstructured data. This role is also responsible for driving
innovation, prototyping, and recommending solutions. Above all, you will influence how
users interact with Conde Nast’s industry-leading journalism.
Primary Responsibilities
Data Architect is responsible for
• Demonstrated technology and personal leadership experience in architecting,
designing, and building highly scalable solutions and products.
• Enterprise scale expertise in data management best practices such as data integration,
data security, data warehousing, metadata management and data quality.
• Extensive knowledge and experience in architecting modern data integration
frameworks, highly scalable distributed systems using open source and emerging data
architecture designs/patterns.
• Experience building external cloud (e.g. GCP, AWS) data applications and capabilities is
highly desirable.
• Expert ability to evaluate, prototype and recommend data solutions and vendor
technologies and platforms.
• Proven experience in relational, NoSQL, ELT/ETL technologies and in-memory
databases.
• Experience with DevOps, Continuous Integration and Continuous Delivery technologies
is desirable.
• This role requires 15+ years of data solution architecture, design and development
delivery experience.
• Solid experience in Agile methodologies (Kanban and SCRUM)
Required Skills
• Very strong experience in building large-scale, high-performance data platforms.
• Passionate about technology and delivering solutions for difficult and intricate problems. Current on relational and NoSQL databases on the cloud.
• Proven leadership skills; demonstrated ability to mentor, influence and partner with cross-functional teams to deliver scalable, robust solutions.
• Mastery of relational database, NoSQL, ETL/ELT (such as Informatica, DataStage, etc.) and data integration technologies.
• Experience in any one object-oriented programming language (Java, Scala, Python) and Spark.
• Creative view of markets and technologies combined with a passion to create the future.
• Knowledge of cloud-based distributed/hybrid data-warehousing solutions and data lakes is mandatory.
• Good understanding of emerging technologies and their applications.
• Understanding of code versioning tools such as GitHub, SVN, CVS, etc.
• Understanding of Hadoop architecture and Hive SQL.
• Knowledge of at least one workflow orchestration tool.
• Understanding of Agile frameworks and delivery.

Preferred Skills:
● Experience in AWS and EMR would be a plus
● Exposure to workflow orchestration tools like Airflow is a plus
● Exposure to at least one NoSQL database would be a plus
● Experience in Databricks along with PySpark/Spark SQL would be a plus
● Experience with the Digital Media and Publishing domain would be a plus
● Understanding of digital web events, ad streams, and context models

About Condé Nast

CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and social
platforms - in other words, a staggering amount of user data. Condé Nast made the right
move to invest heavily in understanding this data and formed a whole new Data team
entirely dedicated to data processing, engineering, analytics, and visualization. This team
helps drive engagement, fuel process innovation, further content enrichment, and increase
market revenue. The Data team aimed to create a company culture where data was the
common language and facilitate an environment where insights shared in real-time could
improve performance.
The Global Data team operates out of Los Angeles, New York, Chennai, and London. The
team at Condé Nast Chennai works extensively with data to amplify its brands' digital
capabilities and boost online revenue. We are broadly divided into four groups, Data
Intelligence, Data Engineering, Data Science, and Operations (including Product and
Marketing Ops, Client Services) along with Data Strategy and monetization. The teams built
capabilities and products to create data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create diverse
forms of self-expression. At Condé Nast, we encourage the imaginative and celebrate the
extraordinary. We are a media company for the future, with a remarkable past. We are
Condé Nast, and It Starts Here.

Amagi Media Labs
Posted by Rajesh C
Chennai
15 - 18 yrs
Best in industry
Data architecture
Architecture
Data Architect
Architect
Java
+5 more
Job Title: Data Architect
Job Location: Chennai
Job Summary

The Engineering team is seeking a Data Architect. As a Data Architect, you will drive a
Data Architecture strategy across various Data Lake platforms. You will help develop
reference architecture and roadmaps to build highly available, scalable and distributed
data platforms using cloud based solutions to process high volume, high velocity and
wide variety of structured and unstructured data. This role is also responsible for driving
innovation, prototyping, and recommending solutions. Above all, you will influence how
users interact with Conde Nast’s industry-leading journalism.
Primary Responsibilities
Data Architect is responsible for
• Demonstrated technology and personal leadership experience in architecting,
designing, and building highly scalable solutions and products.
• Enterprise scale expertise in data management best practices such as data integration,
data security, data warehousing, metadata management and data quality.
• Extensive knowledge and experience in architecting modern data integration
frameworks, highly scalable distributed systems using open source and emerging data
architecture designs/patterns.
• Experience building external cloud (e.g. GCP, AWS) data applications and capabilities is
highly desirable.
• Expert ability to evaluate, prototype and recommend data solutions and vendor
technologies and platforms.
• Proven experience in relational, NoSQL, ELT/ETL technologies and in-memory
databases.
• Experience with DevOps, Continuous Integration and Continuous Delivery technologies
is desirable.
• This role requires 15+ years of data solution architecture, design and development
delivery experience.
• Solid experience in Agile methodologies (Kanban and SCRUM)
Required Skills
• Very strong experience in building large-scale, high-performance data platforms.
• Passionate about technology and delivering solutions for difficult and intricate problems. Current on relational and NoSQL databases on the cloud.
• Proven leadership skills; demonstrated ability to mentor, influence and partner with cross-functional teams to deliver scalable, robust solutions.
• Mastery of relational database, NoSQL, ETL/ELT (such as Informatica, DataStage, etc.) and data integration technologies.
• Experience in any one object-oriented programming language (Java, Scala, Python) and Spark.
• Creative view of markets and technologies combined with a passion to create the future.
• Knowledge of cloud-based distributed/hybrid data-warehousing solutions and data lakes is mandatory.
• Good understanding of emerging technologies and their applications.
• Understanding of code versioning tools such as GitHub, SVN, CVS, etc.
• Understanding of Hadoop architecture and Hive SQL.
• Knowledge of at least one workflow orchestration tool.
• Understanding of Agile frameworks and delivery.

Preferred Skills:
● Experience in AWS and EMR would be a plus
● Exposure to workflow orchestration tools like Airflow is a plus
● Exposure to at least one NoSQL database would be a plus
● Experience in Databricks along with PySpark/Spark SQL would be a plus
● Experience with the Digital Media and Publishing domain would be a plus
● Understanding of digital web events, ad streams, and context models
About Condé Nast
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast successfully expanded and diversified into digital, TV, and social
platforms - in other words, a staggering amount of user data. Condé Nast made the right
move to invest heavily in understanding this data and formed a whole new Data team
entirely dedicated to data processing, engineering, analytics, and visualization. This team
helps drive engagement, fuel process innovation, further content enrichment, and increase
market revenue. The Data team aimed to create a company culture where data was the
common language and facilitate an environment where insights shared in real-time could
improve performance.
The Global Data team operates out of Los Angeles, New York, Chennai, and London. The
team at Condé Nast Chennai works extensively with data to amplify its brands' digital
capabilities and boost online revenue. We are broadly divided into four groups, Data
Intelligence, Data Engineering, Data Science, and Operations (including Product and
Marketing Ops, Client Services) along with Data Strategy and monetization. The teams built
capabilities and products to create data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create diverse
forms of self-expression. At Condé Nast, we encourage the imaginative and celebrate the
extraordinary. We are a media company for the future, with a remarkable past. We are
Condé Nast, and It Starts Here.

Amazon India
Posted by Akhil Ravipalli
Bengaluru (Bangalore), Hyderabad, Delhi, Gurugram, Noida, Ghaziabad, Faridabad, Chennai, Pune
2 - 9 yrs
₹15L - ₹60L / yr
Systems design
Data Structures
Algorithms
Java
Python
+6 more

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

 

Top Skills

 

  • You write high quality, maintainable, and robust code, often in Java or C++/C/Python/ROR/C#
  • You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
  • You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

 

  • You solve problems at their root, stepping back to understand the broader context.
  • You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
  • You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
  • You recognize and use design patterns to solve business problems.
  • You understand how operating systems work, perform and scale.
  • You continually align your work with Amazon’s business objectives and seek to deliver business value.
  • You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
  • You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
  • You communicate clearly with your team and with other groups and listen effectively.

 

Skills & Experience

 

  • Bachelors or Masters in Computer Science or relevant technical field.
  • Experience in software development and full product life-cycle.
  • Excellent programming skills in any object oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.
  • Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
  • Proficiency in SQL and data modeling.

Amazon India
Posted by Srilalitha K
Hyderabad, Bengaluru (Bangalore), Delhi, Gurugram, Pune, Chennai
3 - 9 yrs
₹2L - ₹15L / yr
C
C++
C#
Python
.NET
+14 more

Software Development Engineer – SDE 2

 

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

 

 Top Skills

You write high quality, maintainable, and robust code, often in Java or C++ or C#

You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.

You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

You solve problems at their root, stepping back to understand the broader context.

You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.

You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.

You recognize and use design patterns to solve business problems.

You understand how operating systems work, perform and scale.

You continually align your work with Amazon’s business objectives and seek to deliver business value.

You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.

You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.

You communicate clearly with your team and with other groups and listen effectively.

 

Skills & Experience

Bachelors or Masters in Computer Science or relevant technical field.

Experience in software development and full product life-cycle.

Excellent programming skills in any object-oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.

Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.

Proficiency in SQL and data modeling.

Amazon India
Posted by Archana J
Bengaluru (Bangalore), Hyderabad, Delhi, Pune, Chennai
2 - 9 yrs
₹10L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+4 more
Hi,

Please find the JD below and reply with your updated resume if you are interested.

Software Development Engineer
Bengaluru / Hyderabad / Chennai / Delhi
As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

Top Skills

• You write high quality, maintainable, and robust code, often in Java or C++.
• You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.
• You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.
Roles & Responsibilities

• You solve problems at their root, stepping back to understand the broader context.
• You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.
• You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.
• You recognize and use design patterns to solve business problems.
• You understand how operating systems work, perform and scale.
• You continually align your work with Amazon’s business objectives and seek to deliver business value.
• You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.
• You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.
• You communicate clearly with your team and with other groups and listen effectively.

Skills & Experience

• Bachelors or Masters in Computer Science or relevant technical field.
• Experience in software development and full product life-cycle.
• Excellent programming skills in any object oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.
• Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.
• Proficiency in SQL and data modeling.



About Amazon.com

“Many of the problems we face have no textbook solution, and so we-happily-invent new ones.” – Jeff Bezos

Amazon.com – a place where builders can build. We hire the world's brightest minds and offer them an environment in which they can invent and innovate to improve the experience for our customers. A Fortune 100 company based in Seattle, Washington, Amazon is the global leader in e-commerce. Amazon offers everything from books and electronics to apparel and diamond jewelry. We operate sites in Australia, Brazil, Canada, China, France, Germany, India, Italy, Japan, Mexico, Netherlands, Spain, United Kingdom and United States, and maintain dozens of fulfillment centers around the world which encompass more than 26 million square feet.

Technological innovation drives the growth of Amazon, offering our customers more selection, convenient shopping, and low prices. Amazon Web Services provides developers and small to large businesses access to the horizontally scalable state of the art cloud infrastructure like S3, EC2, AMI, CloudFront and SimpleDB, that powers Amazon.com. Developers can build any type of business on Amazon Web Services and scale their application with growing business needs.

We want you to help share and shape our mission to be Earth's most customer-centric company. Amazon's evolution from Web site to e-commerce partner to development platform is driven by the spirit of invention that is part of our DNA. We do this every day by inventing elegant and simple solutions to complex technical and business problems. We're making history and the good news is that we've only just begun.


About Amazon India

Amazon teams in India work on complex business challenges to innovate and create efficient solutions that enable various Amazon businesses, including Amazon websites across the world as well as support Payments, Transportation, and Digital products and services like the Kindle family of tablets, e-readers and the store. We are proud to have some of the finest talent and strong leaders with proven experience working to make Amazon the Earth’s most customer-centric company.

We made our foray into the Indian market with the launch of Junglee.com, enabling retailers in India to advertise their products to millions of Indian shoppers and drive targeted traffic to their stores. In June 2013, we launched www.amazon.in for shoppers in India. With www.amazon.in, we endeavor to give customers more of what they want – low prices, vast selection, fast and reliable delivery, and a trusted and convenient online shopping experience. In just over a year of launching our India operations, we have expanded our offering to over 18 million products across 36 departments and 100s of categories! Our philosophy of working backwards from the customers is what drives our growth and success.



We will continue to strive to become a trusted and meaningful sales and logistics channel for retailers of all sizes across India and a fast, reliable and convenient online shopping destination for consumers. For us, it is always “Day 1” and we are committed to aggressively invest over the long-term and relentlessly focus on raising the bar for customer experience in India.

Amazon India offers opportunities where you can dive right in, work with smart people on challenging problems and make an impact that contributes to the lives of millions. Join us so you can - Work Hard, Have Fun and Make History.

Thanks and regards,
Archana J
Recruiter (Tech) | Consumer TA

Amazon India
Posted by Nithya Nagarathinam
Bengaluru (Bangalore), Chennai, Hyderabad, Pune, Gurugram, India
3 - 9 yrs
₹1L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+6 more

Role: Software Development Engineer - 2

As a Software Development Engineer at Amazon, you have industry-leading technical abilities and demonstrate breadth and depth of knowledge. You build software to deliver business impact, making smart technology choices. You work in a team and drive things forward.

Top Skills

You write high quality, maintainable, and robust code, often in Java or C++ or C#

You recognize and adopt best practices in software engineering: design, testing, version control, documentation, build, deployment, and operations.

You have experience building scalable software systems that are high-performance, highly-available, highly transactional, low latency and massively distributed.

Roles & Responsibilities

You solve problems at their root, stepping back to understand the broader context.

You develop pragmatic solutions and build flexible systems that balance engineering complexity and timely delivery, creating business impact.

You understand a broad range of data structures and algorithms and apply them to deliver high-performing applications.

You recognize and use design patterns to solve business problems.

You understand how operating systems work, perform and scale.

You continually align your work with Amazon’s business objectives and seek to deliver business value.

You collaborate to ensure that decisions are based on the merit of the proposal, not the proposer.

You proactively support knowledge-sharing and build good working relationships within the team and with others in Amazon.

You communicate clearly with your team and with other groups and listen effectively.

Skills & Experience

Bachelors or Masters in Computer Science or relevant technical field.

Experience in software development and full product life-cycle.

Excellent programming skills in any object-oriented programming languages - preferably Java, C/C++/C#, Perl, Python, or Ruby.

Strong knowledge of data structures, algorithms, and designing for performance, scalability, and availability.

Proficiency in SQL and data modeling.

prevaj consultants pvt ltd
Posted by Nilofer Jamal
Remote, Chennai
8 - 12 yrs
₹30L - ₹50L / yr
Java
Amazon Web Services (AWS)
SaaS
Restructuring
Scala
+1 more
  • Candidates should have 5+ years of experience in Core Java
  • You will need strong development skills to work on and improve our Scala-based services, and be able to work together with senior teammates to create appropriate architectural designs and ensure all aspects are appropriate to meet the business need.
  • Excellent functional design and functional programming skills (more than 2 years of business experience in Scala and Java projects, respectively)
  • Core skills in key supporting technologies and/or frameworks such as Play (Akka) / Lagom
  • Proven experience working in teams in the successful delivery of complex, performant and high quality products
  • Excellent spoken and written communication skills
  • Experience of SaaS (Software as a Service) environments
  • Exposure to RESTful web APIs and a service oriented architecture
  • Experience in Linux environments, Shell scripting etc
  • Working with XML and JSON, including parsing, asserting/matching and extracting (see the sketch after this list)
  • Experience with Continuous Integration environments and build tools, including Terraform, Jenkins, Maven, Gradle and Ant
  • Experience with messaging systems such as Apache Kafka, Amazon Kinesis, Amazon SQS and Rabbit MQ
  • Experience working on Live platform SDKs such as Twilio, AWS Elemental
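
As a hedged illustration of the JSON handling mentioned above, the sketch below parses and pattern-matches a payload with play-json, which sits naturally alongside the Play stack named earlier; using play-json is an assumption for the example, and the Customer shape and payload are invented.

```scala
import play.api.libs.json._

object JsonParsingSketch {
  // Hypothetical payload shape for the example.
  final case class Customer(id: Long, name: String, tags: Seq[String])
  implicit val customerReads: Reads[Customer] = Json.reads[Customer]

  def main(args: Array[String]): Unit = {
    val payload = """{"id": 42, "name": "Asha", "tags": ["premium", "chennai"]}"""

    // Parse, then extract and match against the expected structure.
    Json.parse(payload).validate[Customer] match {
      case JsSuccess(customer, _) =>
        println(s"Parsed: ${customer.name}, tags=${customer.tags.mkString(",")}")
      case JsError(errors) =>
        println(s"Payload did not match the expected shape: $errors")
    }
  }
}
```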

American Multinational Retail Corp

Agency job
via Hunt & Badge Consulting Pvt Ltd by Chandramohan Subramanian
Chennai
2 - 5 yrs
₹5L - ₹15L / yr
Scala
Spark
Apache Spark

Should have a passion to learn and adopt new technologies, an understanding of how to solve/troubleshoot issues and risks, the ability to make informed decisions, and the ability to lead projects.

 

Your Qualifications

 

  • 2-5 years' experience with functional programming
  • Experience with functional programming using Scala with the Spark framework (see the sketch after this list).
  • Strong understanding of object-oriented programming, data structures and algorithms
  • Good experience in any of the cloud platforms (Azure, AWS, GCP)
  • Experience with distributed (multi-tiered) systems, relational databases and NoSQL storage solutions
  • Desire to learn new technologies and languages
  • Participation in software design, development, and code reviews
  • High level of proficiency with Computer Science/Software Engineering knowledge and contribution to the technical skills growth of other team members
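
To give a flavour of functional-style Scala on Spark as mentioned above, here is a small, hypothetical sketch using case classes and the typed Dataset API; the data and model are invented for the example.

```scala
import org.apache.spark.sql.SparkSession

object TypedPipelineSketch {
  // Case classes give the pipeline a typed, functional flavour.
  final case class Sale(store: String, sku: String, qty: Int, price: Double)
  final case class StoreRevenue(store: String, revenue: Double)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("typed-pipeline-sketch").getOrCreate()
    import spark.implicits._

    // Hypothetical in-memory data; a real job would read from storage.
    val sales = Seq(
      Sale("chennai-01", "SKU-1", 2, 199.0),
      Sale("chennai-01", "SKU-2", 1, 499.0),
      Sale("chennai-02", "SKU-1", 5, 199.0)
    ).toDS()

    // Pure transformations expressed over the typed Dataset API.
    val revenueByStore = sales
      .map(s => StoreRevenue(s.store, s.qty * s.price))
      .groupByKey(_.store)
      .mapGroups((store, rows) => StoreRevenue(store, rows.map(_.revenue).sum))

    revenueByStore.show()
    spark.stop()
  }
}
```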


Your Responsibility

 

  • Design, build and configure applications to meet business process and application requirements
  • Proactively identify and communicate potential issues and concerns and recommend/implement alternative solutions as appropriate.
  • Troubleshooting & Optimization of existing solution

 

Provide advice on technical design to ensure solutions are forward looking and flexible for potential future requirements and business needs.

Mobile Programming LLC
Posted by Apurva kalsotra
Mohali, Gurugram, Pune, Bengaluru (Bangalore), Hyderabad, Chennai
3 - 8 yrs
₹2L - ₹9L / yr
Data engineering
Data engineer
Spark
Apache Spark
Apache Kafka
+13 more

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture,
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer

  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
  • Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores (a minimal consumer sketch follows this list).
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
  • We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:

  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
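
As a hedged sketch of the message-queuing and stream-processing knowledge referenced above, the snippet below consumes a topic with the plain kafka-clients Java API from Scala; the broker address, group id and topic name are hypothetical.

```scala
import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer
import scala.jdk.CollectionConverters._ // Scala 2.13 converters

object EventConsumerSketch {
  def main(args: Array[String]): Unit = {
    val props = new Properties()
    props.put("bootstrap.servers", "localhost:9092")   // hypothetical broker
    props.put("group.id", "clickstream-readers")        // hypothetical consumer group
    props.put("key.deserializer", classOf[StringDeserializer].getName)
    props.put("value.deserializer", classOf[StringDeserializer].getName)

    val consumer = new KafkaConsumer[String, String](props)
    consumer.subscribe(List("clickstream-events").asJava) // hypothetical topic

    try {
      while (true) {
        // Poll for new records and hand them to downstream processing.
        val records = consumer.poll(Duration.ofMillis(500))
        records.asScala.foreach(r => println(s"${r.key} -> ${r.value}"))
      }
    } finally consumer.close()
  }
}
```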

Mobile Programming LLC
Posted by Apurva kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities

  • Develop complex queries, pipelines and software programs to solve analytics and data mining problems
  • Interact with other data scientists, product managers, and engineers to understand business problems and technical requirements in order to deliver predictive and smart data solutions
  • Prototype new applications or data systems
  • Lead data investigations to troubleshoot data issues that arise along the data pipelines
  • Collaborate with different product owners to incorporate data science solutions
  • Maintain and improve the data science platform

Must Have

  • BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
  • Strong fundamentals: data structures, algorithms, databases
  • 5+ years of software industry experience, with 2+ years in analytics, data mining, and/or data warehousing
  • Fluency with Python
  • Experience developing web services using REST approaches
  • Proficiency with SQL/Unix/Shell
  • Experience in DevOps (CI/CD, Docker, Kubernetes)
  • Self-driven, challenge-loving, detail-oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations

Preferred

  • Industry experience with big data processing technologies such as Spark and Kafka
  • Experience with machine learning algorithms and/or R is a plus
  • Experience in Java/Scala is a plus
  • Experience with any MPP analytics engines like Vertica
  • Experience with data integration tools like Pentaho/SAP Analytics Cloud

CIEL HR Services
Posted by Sivakumar S
Chennai, Bengaluru (Bangalore)
3 - 7 yrs
₹8L - ₹16L / yr
Scala
Microservices
Test driven development (TDD)
CI/CD
HTTP
+16 more

As a Scala Developer, you are part of the development of the core applications using the Micro Service paradigm. You will join an Agile team, working closely with our product owner, building and delivering a set of Services as part of our order management and fulfilment platform. We deliver value to our business with every release, meaning that you will immediately be able to contribute and make a positive impact.

Our approach to technology is to use the right tool for the job and, through good software engineering practices such as TDD and CI/CD, to build high-quality solutions that are built with a view to maintenance. 

 

Requirements

The Role:

  • Build high-quality applications and HTTP based services.
  • Work closely with technical and non-technical colleagues to ensure the services we build meet the needs of the business.
  • Support development of a good understanding of business requirements and corresponding technical specifications.
  • Actively contribute to planning, estimation and implementation of team work.
  • Participate in code review and mentoring processes.
  • Identify and plan improvements to our services and systems.
  • Monitor and support production services and systems.
  • Keep up with industry trends and new tools, technologies & development methods with a view to adopting best practices that fit the team and promote adoption more widely.

 

Relevant Skills & Experience: 

The following skills and experience are relevant to the role and we are looking for someone who can hit the ground running in these areas.

  • Web service application development in Scala (essential)
  • Functional Programming (essential)
  • API development and microservice architecture (essential)
  • Patterns for building scalable, performant, distributed systems (essential)
  • Databases – we use PostgreSQL (essential)
  • Common libraries – we use Play, Cats and Slick (essential; see the sketch after this list)
  • Strong communication and collaboration skills (essential)
  • Performance profiling and analysis of JVM based applications
  • Messaging frameworks and patterns
  • Testing frameworks and tools
  • Docker, virtualisation and cloud computing – we use AWS and Vmware
  • Javascript including common frameworks such as React, Angular, etc
  • Linux systems administration
  • Configuration tooling such as Puppet and Ansible
  • Continuous delivery tools and environments
  • Agile software delivery
  • Troubleshooting and diagnosing complex production issues
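
As a hedged illustration of the Scala HTTP service work described above, here is a minimal Play controller returning JSON; the Order model and route are invented for the example, and a real service in this stack would typically back it with Slick against PostgreSQL.

```scala
import javax.inject._
import play.api.libs.json._
import play.api.mvc._

// Hypothetical order model for the example.
final case class Order(id: Long, sku: String, quantity: Int)

object Order {
  implicit val writes: OWrites[Order] = Json.writes[Order]
}

@Singleton
class OrderController @Inject()(cc: ControllerComponents) extends AbstractController(cc) {

  // Wired in conf/routes, e.g.: GET /orders/:id controllers.OrderController.get(id: Long)
  def get(id: Long): Action[AnyContent] = Action {
    // A real implementation would load the order from the database via Slick.
    Ok(Json.toJson(Order(id, "SKU-123", 1)))
  }
}
```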

 

Benefits

  • Fun, happy and politics-free work culture built on the principles of lean and self organisation.
  • Work with large scale systems powering global businesses.
  • Competitive salary and benefits.

 

Note: We are looking for immediate joiners; the offered candidate is expected to join within 15 days. Buyout reimbursement is available for applicants on a 30 to 60 day notice period who are ready to join within 15 days.

To build on our success, we are looking for smart, conscientious software developers who want to work in a friendly, engaging environment and take our platform and products forward. In return, you will have the opportunity to work with the latest technologies, frameworks & methodologies in service development in an environment where we value collaboration and learning opportunities.

BUDDIAI
Posted by Dhana Lakshmi A
Chennai
3 - 10 yrs
₹4L - ₹15L / yr
Java
Data Structures
Algorithms
Scala
C++
+7 more

Job description

 

We are looking for a passionate Software Development Engineer to develop, test, maintain and document program code in accordance with user requirements and system technical specifications. As a Software Development Engineer, you will work with other Developers and Product Managers throughout the software development life cycle.

Software Development Engineer responsibilities include analysing requirements, defining system functionality and writing code in the company's current technology stack. The candidate is expected to be familiar with the software development life cycle (SDLC) process, from preliminary system analysis to tests and deployment. Ultimately, the role of the Software Engineer is to build high-quality, innovative and fully performing software that complies with coding standards and technical design. Your goal will be to build efficient programs and systems that serve user needs.

To be qualified for this role, you should hold a minimum of a Bachelor’s degree in a relevant field, like Computer Science, IT or Software Engineering. You should be a team player with a keen eye for detail and problem-solving skills. If you also have experience in SDLC, Agile frameworks and popular coding languages (e.g., Java), plus strong computer science fundamentals, we’d like to meet you.

Years of experience: 2 to 10 years.

Roles & Responsibilities

The overview of this position (based in Chennai, India) includes:

  • Develops, enhances, debugs, supports, maintains and tests software applications that support business units or supporting functions. These application program solutions may involve diverse development platforms, software, hardware, technologies and tools.
  • Participates in the design, development and implementation of complex applications, often using new technologies.
  • Technology professional with experience in designing and managing the implementation of future looking, flexible and reusable, enterprise applications and components.
  • Expert in translating business requirements into an application design that includes Data Model, Web Screens, Web Services, and batch processing.
  • May provide technical direction and system architecture for individual initiatives.
  • Serves as a fully seasoned/proficient technical resource.
  • Deploy programs, gather and evaluate user feedback
  • Recommend and execute improvements
  • Create technical documentation for reference and reporting
  • Develop software verification plans and quality assurance procedures
  • Document and maintain software functionality
  • Ensure software is updated with latest features
  • Good interpersonal and technology understanding skills
  • Evaluate open-source components and integrate into product pipeline

Skills and Qualifications

  • Hands-on experience in analysis, design, coding, and implementation of complex, custom-built applications.
  • Strong Java development skills (Java, J2EE, Struts, Spring, Web Services, Eclipse, UI screens, AngularJS, React.js)
  • Excellent debugging skills
  • Strong knowledge of databases (MySQL, MS SQL Server and NoSQL databases)
  • Understanding of various deployment servers (Apache Tomcat is a must)
  • Understanding of OO skills, including strong design patterns knowledge is a must.
  • Strong understanding in creating and maintaining web services.
  • Understanding of the software development life cycle
  • Experience with Implementation and release management activities
  • Good understanding of unit/system and functional testing methodology
  • Experience working in large transaction-based systems
  • Knowledge of software best practices, like Test-Driven Development (TDD) and Continuous Integration (CI)
  • Experience documenting technical functions
  • Desire to contribute to the wider technical community through collaboration, coaching, and mentoring of other technologists.
  • Experience in Linux based systems, development of shell-based scripts.

Job Training

  • Training on the coding paradigms, guidelines, frameworks, usage of the applications would be provided by the engineers
  • Periodic training sessions would be conducted by the technical architects in terms of technology and skills to be learnt
  • Periodic, structured training would be provided on the applications

Hours & Environment

  • Typical 40 hours of work a week
  • Depending on the requirements, work hours may have to be extended during the day or on weekends

netmedscom
Posted by Vijay Hemnath
Chennai
2 - 5 yrs
₹6L - ₹25L / yr
Big Data
Hadoop
Apache Hive
Scala
Spark
+12 more

We are looking for an outstanding Big Data Engineer with experience setting up and maintaining Data Warehouse and Data Lakes for an Organization. This role would closely collaborate with the Data Science team and assist the team build and deploy machine learning and deep learning models on big data analytics platforms.

Roles and Responsibilities:

  • Develop and maintain scalable data pipelines and build out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using 'Big Data' technologies.
  • Develop programs in Scala and Python as part of data cleaning and processing.
  • Assemble large, complex data sets that meet functional/non-functional business requirements and foster data-driven decision making across the organization.
  • Design and develop distributed, high-volume, high-velocity multi-threaded event processing systems.
  • Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for key stakeholders and the business processes that depend on it.
  • Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Provide high operational excellence, guaranteeing high availability and platform stability.
  • Closely collaborate with the Data Science team and assist the team in building and deploying machine learning and deep learning models on big data analytics platforms (a minimal sketch follows this list).
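To make the last point concrete, here is a minimal sketch of training and persisting a model with Spark MLlib in Scala. The feature table, column names, and output path are hypothetical placeholders used only for illustration; they are not part of this job description.

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.sql.SparkSession

object ChurnModelJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("churn-model").getOrCreate()

    // Hypothetical curated feature table produced by an upstream data pipeline.
    val features = spark.read.parquet("/data/curated/customer_features")

    // Combine the numeric feature columns into a single vector column.
    val assembler = new VectorAssembler()
      .setInputCols(Array("orders_90d", "avg_basket", "days_since_last_order"))
      .setOutputCol("features")

    val lr = new LogisticRegression()
      .setLabelCol("churned")
      .setFeaturesCol("features")

    // Fit the whole pipeline and persist it so a scoring job can reload it later.
    val model = new Pipeline().setStages(Array(assembler, lr)).fit(features)
    model.write.overwrite().save("/models/churn_lr")

    spark.stop()
  }
}
```

The same pattern applies whether the model is a quick notebook prototype or a scheduled batch job; only the feature engineering and the model stage change.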

Skills:

  • Experience with Big Data pipelines, Big Data analytics, and data warehousing.
  • Experience with SQL/NoSQL, schema design, and dimensional data modeling.
  • Strong understanding of Hadoop architecture and the HDFS ecosystem, and experience with a Big Data technology stack such as HBase, Hadoop, Hive, MapReduce.
  • Experience in designing systems that process structured as well as unstructured data at large scale.
  • Experience in AWS/Spark/Java/Scala/Python development.
  • Should have strong skills in PySpark (Python & Spark): the ability to create, manage, and manipulate Spark DataFrames, and expertise in Spark query tuning and performance optimization (see the sketch after this list).
  • Experience in developing efficient software code/frameworks for multiple use cases leveraging Python and big data technologies.
  • Prior exposure to streaming data sources such as Kafka.
  • Should have knowledge of shell scripting and Python scripting.
  • High proficiency in database skills (e.g., Complex SQL), for data preparation, cleaning, and data wrangling/munging, with the ability to write advanced queries and create stored procedures.
  • Experience with NoSQL databases such as Cassandra / MongoDB.
  • Solid experience in all phases of Software Development Lifecycle - plan, design, develop, test, release, maintain and support, decommission.
  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development).
  • Experience building and deploying applications on on-premise and cloud-based infrastructure.
  • A good understanding of the machine learning landscape and its concepts.
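As a rough sketch of the DataFrame skills called out above, the snippet below reads a raw dataset, filters out bad rows, and aggregates it with the Spark Scala API. The file paths and column names are hypothetical.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DailyRevenueJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("daily-revenue").getOrCreate()

    // Hypothetical raw extract landed by an upstream ingestion job.
    val orders = spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv("/data/raw/orders.csv")

    // Drop malformed rows, derive a date column, and aggregate revenue per day.
    val dailyRevenue = orders
      .filter(col("amount").isNotNull && col("amount") > 0)
      .withColumn("order_date", to_date(col("order_ts")))
      .groupBy("order_date")
      .agg(sum("amount").alias("revenue"))

    dailyRevenue.write.mode("overwrite").parquet("/data/curated/daily_revenue")
    spark.stop()
  }
}
```

Query tuning then starts from the physical plan (`dailyRevenue.explain()`): checking partitioning, shuffle sizes, and whether filters are pushed down before widening the aggregation.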

 

Qualifications and Experience:

Engineering graduates and postgraduates, preferably in Computer Science, from premier institutions, with 3-5 years of proven work experience as a Big Data Engineer or in a similar role.

Certifications:

Good to have at least one of the Certifications listed here:

  • AZ-900 - Azure Fundamentals
  • DP-200, DP-201, DP-203, AZ-204 - Data Engineering
  • AZ-400 - DevOps Certification

Read more
IMPACTREE DATA TECHNOLOGIES
Rajashri sai
Posted by Rajashri sai
Remote, Chennai, Mumbai, Bengaluru (Bangalore)
2 - 7 yrs
₹2.4L - ₹3.6L / yr
skill iconJava
Software Development
Data Structures
Algorithms
skill iconScala
+14 more
Impactree Data Technologies is a young social enterprise working to help social organisations scale development programmes using real-time data monitoring and analytics technology. Impactree specifically works at building data intelligence for two sectors - rural livelihoods and education. Our team consists of members qualified from London Business School, IIM Kozhikode, ICAI, Anna University, etc.

We are an upcoming profitable social enterprise, and as a part of the team we are looking for a candidate who can work with us to build better analytics and intelligence into our platform Prabhaav.

We are looking for a Software Developer to build and implement functional programs. You will work with other Developers and Product Managers throughout the software development life cycle.


In this role, you should be a team player with a keen eye for detail and good problem-solving skills. Experience with Agile frameworks and popular coding languages (e.g. JavaScript) is also expected.


Your goal will be to build efficient programs and systems that serve user needs. 

Technical Skills we are looking for are:

 

  • Producing clean, efficient code based on specifications
  • Coding abilities in HTML, PHP, JS, JSP/Servlets, Java, and DevOps (basic knowledge).
  • Additional skills (preferred): NodeJS, Python, AngularJS.
  • System administrator experience: Linux (Ubuntu/RedHat), Windows CE-Embedded.
  • Database experience: MySQL, Postgres, MongoDB.
  • Data format experience: JSON, XML, AJAX, jQuery.
  • Should have depth in software architecture design, especially for stand-alone software products or SaaS platforms.
  • Should have basic experience/knowledge of microservices, REST APIs, and SOAP methodologies.
  • Should have built some backend architecture for long-standing applications.
  • Good HTML design sense.
  • Experience with AWS services like EC2 and Lightsail is preferred.
  • Testing and deploying programs and systems.
  • Fixing and improving existing software.
  • Good understanding of OOP and similar concepts.
  • Research new JS frameworks like React.js and AngularJS.

 

Experience areas we are looking for: 
  • Proven experience as a Software Developer, Software Engineer, or similar role
  • Familiarity with Agile development methodologies
  • Experience with software design and development in a test-driven environment
  • Knowledge of coding languages (e.g. Java, JavaScript) and frameworks/systems (e.g. AngularJS, Git)
  • Experience with databases and Object-Relational Mapping (ORM) frameworks (e.g. Hibernate)
  • Ability to learn new languages and technologies
  • Excellent communication skills
  • Resourcefulness and troubleshooting aptitude
  • Attention to detail

 

Read more
Maveric Systems

at Maveric Systems

3 recruiters
Rashmi Poovaiah
Posted by Rashmi Poovaiah
Bengaluru (Bangalore), Chennai, Pune
4 - 10 yrs
₹8L - ₹15L / yr
Big Data
Hadoop
Spark
Apache Kafka
HiveQL
+2 more

Role Summary/Purpose:

We are looking for Developers/Senior Developers to be a part of building an advanced analytical platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing and challenging work environment, and the role will play an important part in resolving and influencing high-level decisions.

 

Requirements:

  • The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
  • An overall minimum of 4 to 8 years of software development experience, including 2 years of Data Warehousing domain knowledge.
  • Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc. (a representative pipeline sketch follows this list).
  • Excellent knowledge of SQL and Linux shell scripting.
  • Bachelor's/Master's/Engineering degree from a well-reputed university.
  • Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively.
  • Proven experience in coordinating many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment.
  • Ability to manage a diverse and challenging stakeholder community.
  • Diverse knowledge and experience of working on Agile deliveries and Scrum teams.
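To illustrate the Spark-plus-Kafka combination above, here is a minimal Structured Streaming job in Scala that consumes a Kafka topic and lands the raw events as Parquet. The broker address, topic name, and paths are hypothetical placeholders.

```scala
import org.apache.spark.sql.SparkSession

object ClickstreamIngestJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("clickstream-ingest").getOrCreate()

    // Read the Kafka topic as an unbounded stream of key/value records.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092") // hypothetical broker
      .option("subscribe", "clickstream")                // hypothetical topic
      .load()
      .selectExpr("CAST(key AS STRING) AS key", "CAST(value AS STRING) AS value", "timestamp")

    // Append each micro-batch to a Parquet directory; the checkpoint makes the job restartable.
    val query = events.writeStream
      .format("parquet")
      .option("path", "/data/raw/clickstream")
      .option("checkpointLocation", "/checkpoints/clickstream")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```

Running it requires the spark-sql-kafka connector on the classpath; Hive or HBase tables would then be populated from the landed files in a separate batch step.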

 

Responsibilities

  • Should work as a senior developer/individual contributor depending on the situation
  • Should be part of Scrum discussions and gather requirements
  • Adhere to the Scrum timeline and deliver accordingly
  • Participate in a team environment for design, development, and implementation
  • Should take up L3 activities on a need basis
  • Prepare Unit/SIT/UAT test cases and log the results
  • Coordinate SIT and UAT testing; take feedback and provide the necessary remediation/recommendations in time
  • Quality delivery and automation should be a top priority
  • Coordinate change and deployment in time
  • Should foster healthy harmony within the team
  • Own interaction points with members of the core team (e.g. BA, testing, and business teams) and any other relevant stakeholders
Read more
Mobile Programming LLC

at Mobile Programming LLC

1 video
34 recruiters
vandana chauhan
Posted by vandana chauhan
Remote, Chennai
3 - 7 yrs
₹12L - ₹18L / yr
Big Data
skill iconAmazon Web Services (AWS)
Hadoop
SQL
skill iconPython
+5 more
Position: Data Engineer  
Location: Chennai- Guindy Industrial Estate
Duration: Full time role
Company: Mobile Programming (https://www.mobileprogramming.com/)
Client Name: Samsung 


We are looking for a Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts and data scientists on data initiatives and will ensure the optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.

Responsibilities for Data Engineer

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional/non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.

Qualifications for Data Engineer

  • Experience building and optimizing big data ETL pipelines, architectures and data sets.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing and highly scalable ‘big data’ data stores.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.

We are looking for a candidate with 3-6 years of experience in a Data Engineer role, who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:

  • Experience with big data tools: Spark, Kafka, HBase, Hive, etc. (see the sketch at the end of this posting)
  • Experience with relational SQL and NoSQL databases
  • Experience with AWS cloud services: EC2, EMR, RDS, Redshift
  • Experience with stream-processing systems: Storm, Spark Streaming, etc.
  • Experience with object-oriented/object function scripting languages: Python, Java, Scala, etc.

Skills: Big Data, AWS, Hive, Spark, Python, SQL
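As a small, hedged sketch of the kind of ETL described above, the job below pulls a relational table over JDBC with Spark (Scala) and lands it on S3 as Parquet. The connection string, table, credentials, and bucket are hypothetical placeholders, not details of this role.

```scala
import org.apache.spark.sql.SparkSession

object CustomersToLakeJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("customers-to-lake").getOrCreate()

    // Extract: read a relational table over JDBC (hypothetical connection details).
    val customers = spark.read
      .format("jdbc")
      .option("url", "jdbc:mysql://db-host:3306/shop")
      .option("dbtable", "customers")
      .option("user", sys.env("DB_USER"))
      .option("password", sys.env("DB_PASSWORD"))
      .load()

    // Load: land the extract on S3 as Parquet for downstream Redshift/EMR consumers.
    customers.write
      .mode("overwrite")
      .parquet("s3a://analytics-lake/raw/customers/") // hypothetical bucket

    spark.stop()
  }
}
```

On AWS the same job would typically run on EMR, with the JDBC driver and the S3A credentials supplied through the cluster configuration.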
 
Read more
ByteAlly

at ByteAlly

1 recruiter
Anand Sukumaran
Posted by Anand Sukumaran
Chennai
1 - 4 yrs
₹4L - ₹8L / yr
skill iconHaskell
skill iconScala
OCaml
We are looking for Haskell developers to join our team. We are a leader in Haskell consulting and one of the very few companies based out of Asia to provide development services in Haskell. We have made several contributions to the Haskell world by open-sourcing our efforts as libraries and frameworks.

We are looking for someone who:
- has 1+ years of professional software engineering experience.
- is familiar with the paradigms of functional programming (a small illustrative sketch follows this list)
- is familiar with at least one FP language (Haskell, Scala, OCaml, etc.)
- has the ability to participate in the entire product development life cycle
- is self-motivated to learn new technologies
- can play a leadership role in coming up with solutions
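ByteAlly's production work is in Haskell; purely as an illustration of the functional style referred to above, here is a small sketch in Scala (one of the FP languages mentioned). The discount domain is invented for the example.

```scala
// Immutable data plus pure functions: behaviour is expressed as values that compose.
final case class Order(id: String, amount: BigDecimal)

object Discounts {
  // A discount rule is just a function from an order to an amount.
  type Rule = Order => BigDecimal

  val bulk: Rule     = o => if (o.amount > 1000) o.amount * 0.05 else BigDecimal(0)
  val seasonal: Rule = o => o.amount * 0.02

  // Rules compose by folding over the list, with no mutation anywhere.
  def total(rules: List[Rule])(order: Order): BigDecimal =
    rules.foldLeft(BigDecimal(0))((acc, rule) => acc + rule(order))
}

object DiscountsDemo extends App {
  val order = Order("A-42", BigDecimal(1500))
  // 5% bulk + 2% seasonal on 1500 = 75 + 30 = 105
  println(Discounts.total(List(Discounts.bulk, Discounts.seasonal))(order))
}
```

The same shape translates directly to Haskell or OCaml: a rule becomes a plain function, and composition replaces shared mutable state.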
Read more
SS Consulting

at SS Consulting

1 recruiter
Sriram Sridhar
Posted by Sriram Sridhar
Chennai
0 - 1 yrs
₹1L - ₹2L / yr
skill iconC++
skill iconJava
Data Structures
Algorithms
skill iconScala
+2 more
Design, develop and maintain applications according to specifications. Interact with clients on requirements. Should be flexible to work on more than one platform depending on the requirement.
Read more
Intelliswift Software

at Intelliswift Software

12 recruiters
Pratish Mishra
Posted by Pratish Mishra
Chennai
4 - 8 yrs
₹8L - ₹17L / yr
Big Data
Spark
skill iconScala
SQL
Greetings from Intelliswift! Intelliswift Software Inc. is a premier software solutions and services company headquartered in Silicon Valley, with offices across the United States, India, and Singapore. The company has a proven track record of delivering results through its global delivery centers and flexible engagement models for over 450 brands ranging from Fortune 100 to growing companies. Intelliswift provides a variety of services including Enterprise Applications, Mobility, Big Data/BI, Staffing Services, and Cloud Solutions. Growing at an outstanding rate, it has been recognized as the second largest private IT company in the East Bay.

Domains: IT, Retail, Pharma, Healthcare, BFSI, and Internet & E-commerce

Website: https://www.intelliswift.com/

Experience: 4-8 Years

Job Location: Chennai

Job Description:

Skills: Spark, Scala, Big Data, Hive

  • Strong working experience in Spark, Scala, big data, HBase and Hive (a short Spark SQL sketch follows).
  • Should have good working experience in SQL and Spark SQL.
  • Good to have knowledge or experience in Teradata.
  • Familiar with general engineering tools: Git, Jenkins, sbt, Maven.
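Purely as an illustrative sketch of the Spark SQL and Hive skills listed above (the table names and date are hypothetical), a report over a Hive table might look like this in Scala:

```scala
import org.apache.spark.sql.SparkSession

object TopSkusReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("top-skus-report")
      .enableHiveSupport() // lets Spark SQL read and write Hive tables
      .getOrCreate()

    // Query an existing Hive table (hypothetical) with plain Spark SQL.
    val topSkus = spark.sql(
      """SELECT sku, SUM(qty) AS units
        |FROM sales.order_items
        |WHERE dt >= '2024-01-01'
        |GROUP BY sku
        |ORDER BY units DESC
        |LIMIT 100""".stripMargin)

    // Persist the result as a managed Hive table for downstream reporting.
    topSkus.write.mode("overwrite").saveAsTable("reports.top_skus")
    spark.stop()
  }
}
```

The same query can also be expressed with the DataFrame API; Spark SQL and DataFrames share the Catalyst optimizer, so the execution plan is the same either way.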
Read more