4+ EMR Jobs in Pune | EMR Job openings in Pune
Greetings! Wissen Technology is hiring for the position of Data Engineer.
Please find the job description below for your reference.
Job Description:
- Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics (see the pipeline sketch after this list).
- Implement data ingestion processes from various sources including APIs, databases, and flat files.
- Optimize and tune big data workflows for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Manage and monitor EMR clusters, ensuring high availability and reliability.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
- Implement data security best practices to ensure data is protected and compliant with relevant regulations.
- Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
- Troubleshoot and resolve issues related to data processing and EMR cluster performance.
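To make the pipeline work above concrete, here is a minimal sketch of a PySpark batch job of the kind that might run as an EMR step: it ingests raw CSV files from S3, cleanses and transforms them, and writes partitioned Parquet to a data lake. The bucket names, paths, and column names are illustrative placeholders, not values from this posting.

```python
# etl_orders.py - illustrative PySpark batch job for an EMR step.
# Bucket names, paths, and columns are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

def main():
    spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

    # Ingest raw CSV files landed in S3 by an upstream process.
    raw = (spark.read
           .option("header", "true")
           .option("inferSchema", "true")
           .csv("s3://example-raw-bucket/orders/2024-01-01/"))

    # Cleanse and transform: drop bad rows, normalize types, derive columns.
    cleaned = (raw
               .dropna(subset=["order_id", "customer_id"])
               .withColumn("order_ts", F.to_timestamp("order_ts"))
               .withColumn("order_date", F.to_date("order_ts"))
               .withColumn("amount", F.col("amount").cast("double")))

    # Write partitioned Parquet to the curated zone of the data lake.
    (cleaned.write
     .mode("overwrite")
     .partitionBy("order_date")
     .parquet("s3://example-curated-bucket/orders/"))

    spark.stop()

if __name__ == "__main__":
    main()
```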
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on big data technologies.
- Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue (see the cluster-launch sketch after this list).
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment.
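As a rough illustration of the EMR and Python experience listed above, the sketch below launches a transient EMR cluster with boto3 and submits the PySpark job as a Spark step. The release label, instance types, IAM roles, and S3 URIs are assumptions for illustration only.

```python
# Illustrative sketch: launch a transient EMR cluster and submit a Spark step.
# All names, roles, and S3 URIs are placeholders.
import boto3

emr = boto3.client("emr", region_name="ap-south-1")

response = emr.run_job_flow(
    Name="orders-daily-etl",
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}, {"Name": "Hive"}],
    Instances={
        "InstanceGroups": [
            {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"InstanceRole": "CORE", "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate when the step finishes
    },
    Steps=[{
        "Name": "orders-etl",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "--deploy-mode", "cluster",
                     "s3://example-artifacts-bucket/jobs/etl_orders.py"],
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
    LogUri="s3://example-logs-bucket/emr/",
)
print("Started cluster:", response["JobFlowId"])
```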
at Wissen Technology
Responsibilities:
- Design, implement, and maintain scalable and reliable database solutions on the AWS platform.
- Architect, deploy, and optimize DynamoDB databases for performance, scalability, and cost-efficiency (see the DynamoDB sketch after this list).
- Configure and manage AWS OpenSearch (formerly Amazon Elasticsearch Service) clusters for real-time search and analytics capabilities.
- Design and implement data processing and analytics solutions using AWS EMR (Elastic MapReduce) for large-scale data processing tasks.
- Collaborate with cross-functional teams to gather requirements, design database solutions, and implement best practices.
- Perform performance tuning, monitoring, and troubleshooting of database systems to ensure high availability and performance.
- Develop and maintain documentation, including architecture diagrams, configurations, and operational procedures.
- Stay current with the latest AWS services, database technologies, and industry trends to provide recommendations for continuous improvement.
- Participate in the evaluation and selection of new technologies, tools, and frameworks to enhance database capabilities.
- Provide guidance and mentorship to junior team members, fostering knowledge sharing and skill development.
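For illustration of the DynamoDB work referenced above, a small boto3 sketch: it creates an on-demand table with a composite key and queries a customer's recent items. Table, key, and attribute names are hypothetical.

```python
# Illustrative DynamoDB sketch: on-demand table with a composite key plus a query.
# Table and attribute names are hypothetical placeholders.
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb", region_name="ap-south-1")

# Create a table keyed by customer (partition) and order timestamp (sort);
# PAY_PER_REQUEST avoids capacity planning for spiky workloads.
table = dynamodb.create_table(
    TableName="orders",
    KeySchema=[
        {"AttributeName": "customer_id", "KeyType": "HASH"},
        {"AttributeName": "order_ts", "KeyType": "RANGE"},
    ],
    AttributeDefinitions=[
        {"AttributeName": "customer_id", "AttributeType": "S"},
        {"AttributeName": "order_ts", "AttributeType": "S"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Write one item and read back a customer's recent orders.
table.put_item(Item={"customer_id": "C123", "order_ts": "2024-01-01T10:00:00Z", "amount": 499})
recent = table.query(
    KeyConditionExpression=Key("customer_id").eq("C123") & Key("order_ts").begins_with("2024-01"),
)
print(recent["Items"])
```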
Requirements:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Proven experience as an AWS Architect or similar role, with a focus on database technologies.
- Hands-on experience designing, implementing, and optimizing DynamoDB databases in production environments.
- In-depth knowledge of AWS OpenSearch (Elasticsearch) and experience configuring and managing clusters for search and analytics use cases (see the OpenSearch sketch after this list).
- Proficiency in working with AWS EMR (Elastic MapReduce) for big data processing and analytics.
- Strong understanding of database concepts, data modelling, indexing, and query optimization.
- Experience with AWS services such as S3, EC2, RDS, Redshift, Lambda, and CloudFormation.
- Excellent problem-solving skills and the ability to troubleshoot complex database issues.
- Solid understanding of cloud security best practices and experience implementing security controls in AWS environments.
- Strong communication and collaboration skills with the ability to work effectively in a team environment.
- AWS certifications such as AWS Certified Solutions Architect, AWS Certified Database - Specialty, or equivalent certifications are a plus.
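A minimal sketch of the OpenSearch experience described above, assuming the opensearch-py client: it connects to a domain endpoint, indexes a document, and runs a full-text query. The endpoint, credentials, and index and field names are placeholders.

```python
# Illustrative OpenSearch sketch: connect, index a document, run a match query.
# Endpoint, credentials, and index/field names are placeholders.
from opensearchpy import OpenSearch

client = OpenSearch(
    hosts=[{"host": "search-example-domain.ap-south-1.es.amazonaws.com", "port": 443}],
    http_auth=("admin", "example-password"),  # or AWS SigV4 auth in practice
    use_ssl=True,
    verify_certs=True,
)

# Index a document (the index is created on first write with default mappings).
client.index(
    index="products",
    id="sku-001",
    body={"name": "wireless mouse", "category": "accessories", "price": 899},
    refresh=True,
)

# Full-text search on the name field.
results = client.search(
    index="products",
    body={"query": {"match": {"name": "mouse"}}},
)
for hit in results["hits"]["hits"]:
    print(hit["_id"], hit["_source"]["name"])
```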
Experience: 4 to 6 years
About the role:
The Product Manager will report to the CEO and build the product from 0 to 1.
Responsibilities
- Work with the senior management team to define the product roadmap
- Work with tech and management to create and execute the product pipeline
- Own the successful delivery of your roadmap and co-own the success of the overall products
- Define requirements in JIRA: create epics and user stories, and track sprints
- Effectively identify and manage cross-team dependencies
- Ensure delivery timelines are met
Required Knowledge and Skills
- Good understanding of eligibility and prior authorization workflows
- Experience as a product lead or lead business analyst (BA)
- Experience with eligibility implementations (API integrations) with clearinghouses such as Waystar, Change Healthcare, and Claim MD
- Experience building and scaling SaaS tools from the ground up
- Excellent communication and collaboration skills
- Ability to use common tools to create mockups, stories, and roadmaps
Preferred Qualifications
- 3+ years of product management experience, including cloud/SaaS products
- US healthcare experience
- Value added: experience with X12 270/271 eligibility transactions (see the simplified sketch below)
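As context for the X12 270/271 point above, here is a heavily simplified sketch of assembling a 270 eligibility-inquiry payload in Python. Real transactions require full ISA/GS envelopes, trading-partner IDs, and clearinghouse-specific conventions (Waystar, Change Healthcare, Claim MD, etc.), so every segment and value below is illustrative only.

```python
# Heavily simplified X12 270 (eligibility inquiry) sketch - illustrative only.
# Real 270s need ISA/GS envelopes, trading-partner IDs, and payer-specific rules.
from datetime import datetime

def build_270(payer_id, provider_npi, member_id, first, last, dob_yyyymmdd):
    now = datetime.utcnow()
    segments = [
        "ST*270*0001*005010X279A1",
        f"BHT*0022*13*INQ0001*{now:%Y%m%d}*{now:%H%M}",
        "HL*1**20*1",                       # information source (payer)
        f"NM1*PR*2*EXAMPLE PAYER*****PI*{payer_id}",
        "HL*2*1*21*1",                      # information receiver (provider)
        f"NM1*1P*2*EXAMPLE CLINIC*****XX*{provider_npi}",
        "HL*3*2*22*0",                      # subscriber
        f"NM1*IL*1*{last}*{first}****MI*{member_id}",
        f"DMG*D8*{dob_yyyymmdd}",
        "EQ*30",                            # 30 = health benefit plan coverage
        "SE*11*0001",                       # segment count includes ST and SE
    ]
    return "~".join(segments) + "~"

print(build_270("12345", "1234567890", "W987654321", "JANE", "DOE", "19900101"))
```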
Basic Qualifications
- Bachelor’s degree in engineering from a Tier 1 institute
- MBA from a Tier 1 institute (good to have)
- 3 years of product management experience, including cloud/SaaS products
Company Profile:
Easebuzz is a payment solutions company (fintech organisation) that enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We focus on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is a place where payments, lending, subscriptions, and eKYC all come together.
We have been consistently profitable and are constantly developing new, innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz’s corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. As an equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.
Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.
Salary: As per company standards.
Designation: Data Engineer
Location: Pune
- Experience with ETL, data modeling, and data architecture
- Design, build, and operationalize large-scale enterprise data solutions and applications using AWS data and analytics services (Spark on EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue) in combination with third-party tools
- Experience with an AWS cloud data lake for developing real-time or near-real-time use cases
- Experience with messaging systems such as Kafka or Kinesis for real-time data ingestion and processing (see the streaming sketch after this list)
- Build data pipeline frameworks to automate high-volume and real-time data delivery
- Create prototypes and proofs of concept for iterative development
- Experience with NoSQL databases such as DynamoDB and MongoDB
- Create and maintain optimal data pipeline architecture
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
- Work with stakeholders, including the executive, product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs
- Keep data separated and secure across national boundaries through multiple data centers and AWS regions
- Create data tools for analytics and data science team members that assist them in building and optimizing the product into an innovative industry leader
- Evangelize a high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into the engineering and science workflows
- Build and enhance data pipeline architecture by designing and implementing data ingestion solutions
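A minimal sketch of the real-time ingestion pattern referenced above, assuming Spark Structured Streaming reading JSON events from Kafka and landing them in an S3 data lake (the spark-sql-kafka connector must be available on the cluster). Broker addresses, topic name, schema, and paths are placeholders.

```python
# Illustrative Spark Structured Streaming job: Kafka -> S3 data lake.
# Broker addresses, topic name, schema, and S3 paths are placeholders.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("payments-stream-ingest").getOrCreate()

payment_schema = StructType([
    StructField("txn_id", StringType()),
    StructField("merchant_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("status", StringType()),
])

# Read raw events from Kafka; the value column holds JSON payloads.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092,broker2:9092")
          .option("subscribe", "payments")
          .option("startingOffsets", "latest")
          .load())

parsed = (events
          .select(F.from_json(F.col("value").cast("string"), payment_schema).alias("e"))
          .select("e.*")
          .withColumn("ingest_date", F.current_date()))

# Land the stream as partitioned Parquet, with checkpointing for fault tolerance.
query = (parsed.writeStream
         .format("parquet")
         .option("path", "s3://example-lake-bucket/payments/")
         .option("checkpointLocation", "s3://example-lake-bucket/_checkpoints/payments/")
         .partitionBy("ingest_date")
         .outputMode("append")
         .start())

query.awaitTermination()
```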
Employment Type: Full-time