6+ ETL Jobs in Ahmedabad
at Simform
Company Description:
Simform is a product engineering company, founded in 2010, that helps organizations ranging from startups to Fortune 500 companies and progressive enterprises. Its remote agile teams of engineers focus on identifying and solving critical business challenges with proven technology practices such as DevOps, cloud-native development, and quality engineering. Simform's mission is to solve complex software engineering problems and make organizations more competitive and agile.
Simform is CMMI Level 3 appraised, an AWS Premier Partner, and an Azure Solutions Partner. We think you'll be excited by what we have to offer.
If you're open to learning more about Simform Solutions and the Senior ASP.NET Developer position, kindly go through our company profile and the job description below for your reference.
Skills:
Experience: 4+ years
Location: Ahmedabad, Gujarat
Mandatory skills: Python 3, Django, microservices, any one cloud (AWS/Azure/GCP), Docker, design patterns
Good-to-have skills: DevOps, Kubernetes, team leading, front-end development
Why Simform? Some of the perks and benefits of working at Simform are:
- Flat-hierarchical, friendly, engineering-oriented, and growth-focused culture.
- Flexible work timings, leave for life events, and work-from-home options.
- Free health insurance.
- Office facility with a large, fully equipped game zone, an in-office kitchen, an affordable lunch service, and free snacks.
- Sponsorship for certifications/events and a library service.
Role Description:
- Collaborate with clients and project teams to understand business requirements and develop efficient, high-quality code that meets or exceeds client expectations.
- Optimize application performance for smooth operation on multiple delivery platforms, including cloud environments like AWS, Azure, or GCP.
- Design and implement low-latency, high-availability, and performant applications using frameworks such as Django, Flask, or FastAPI.
- Lead the integration of user interface elements developed by front-end developers with server-side logic.
- Integrate multiple data sources and databases into one system, ensuring proper integration of data storage and third-party libraries/packages into the application.
- Create scalable and optimized database schemas tailored to specific business logic and handle large volumes of data from databases or over HTTP(S)/WebSockets.
- Conduct thorough testing using pytest and unittest, and perform debugging to ensure applications are bug-free and run smoothly (a brief pytest sketch follows this list).
- Provide mentorship and guidance to junior developers on the team.
- Communicate effectively with clients to understand their needs and provide updates on project progress.
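To make the testing expectation above concrete, here is a minimal pytest sketch; the normalize_email helper is hypothetical and not part of any stated codebase.

    import pytest

    def normalize_email(raw: str) -> str:
        # Hypothetical helper under test: trim and lowercase an email address.
        if not raw or "@" not in raw:
            raise ValueError(f"invalid email: {raw!r}")
        return raw.strip().lower()

    def test_normalize_email_strips_and_lowercases():
        assert normalize_email("  Alice@Example.COM ") == "alice@example.com"

    def test_normalize_email_rejects_garbage():
        # pytest.raises asserts the call fails as designed.
        with pytest.raises(ValueError):
            normalize_email("not-an-email")

Run with: pytest -q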
Skills and Qualifications:
- 3+ years of experience as a Python developer with strong client communication skills and team-leading experience.
- In-depth knowledge of different Python frameworks such as Django, Flask, or FastAPI.
- Strong knowledge of cloud technologies, particularly AWS, Azure, or GCP.
- Deep understanding of microservices architecture, multi-tenant architecture, and best practices in Python development.
- Familiarity with serverless architecture and frameworks such as AWS Lambda or Azure Functions.
- Experience with deployment using Docker, Nginx, Gunicorn, Uvicorn, and Supervisor.
- Hands-on experience with SQL and NoSQL databases such as PostgreSQL and AWS DynamoDB.
- Understanding of different Object Relational Mappers (ORMs), including SQLAlchemy and Django ORM.
- Demonstrated ability to handle multiple API integrations and write modular, reusable code.
- Experience with front-end technologies (HTML, CSS, JavaScript) and frameworks such as React or Vue, enhancing full-stack development capabilities.
- Solid understanding of user authentication and authorization mechanisms across multiple systems and environments.
- Familiarity with fundamental design principles for scalable applications and proficiency in object-oriented and event-driven programming in Python.
- Strong skills in unit testing, debugging, and code optimization.
- Experience with modern software development methodologies, including Agile and Scrum.
- Familiarity with container orchestration tools such as Kubernetes.
- Understanding of data processing frameworks such as Apache Kafka and Spark (good to have).
- Experience with CI/CD pipelines and automation tools like Jenkins, GitLab CI, or CircleCI.
Technical Skills:
- Ability to understand and translate business requirements into design.
- Proficient in AWS infrastructure components such as S3, IAM, VPC, EC2, and Redshift.
- Experience in creating ETL jobs using Python/PySpark (a minimal job sketch follows this list).
- Proficiency in creating AWS Lambda functions for event-based jobs.
- Knowledge of automating ETL processes using AWS Step Functions.
- Competence in building data warehouses and loading data into them.
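As a rough sketch of the kind of PySpark ETL job described above, assuming hypothetical S3 paths and column names (nothing here comes from the posting itself):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders-etl").getOrCreate()

    # Extract: raw CSVs dropped into an S3 landing zone.
    raw = spark.read.option("header", True).csv("s3://example-landing/orders/")

    # Transform: type the amount column and drop rows missing an order id.
    clean = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("order_id").isNotNull())
    )

    # Load: write partitioned Parquet to the curated zone.
    clean.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated/orders/"
    )

A production job would add logging, schema enforcement, and idempotent writes, but the extract/transform/load shape stays the same.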
Responsibilities:
- Understand business requirements and translate them into design.
- Assess AWS infrastructure needs for development work.
- Develop ETL jobs using Python/PySpark to meet requirements.
- Implement AWS Lambda for event-based tasks (a small handler sketch follows this list).
- Automate ETL processes using AWS Step Functions.
- Build data warehouses and manage data loading.
- Engage with customers and stakeholders to articulate the benefits of proposed solutions and frameworks.
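A minimal sketch of an event-based Lambda, assuming an S3 object-created trigger; the bucket and the follow-on action are hypothetical:

    import json
    import urllib.parse

    def handler(event, context):
        # Each record describes one S3 object-created event.
        for record in event.get("Records", []):
            bucket = record["s3"]["bucket"]["name"]
            key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
            # A real job would start a Glue job or Step Functions execution here.
            print(json.dumps({"received": f"s3://{bucket}/{key}"}))
        return {"status": "ok"}

For the Step Functions automation mentioned above, a handler like this would typically be one state in a state machine that sequences ingestion, transformation, and load steps.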
Qualifications:
- Minimum 2 years of .NET development experience (ASP.NET 3.5 or greater and C# 4 or greater).
- Good knowledge of MVC, Entity Framework, and Web API/WCF.
- ASP.NET Core knowledge is preferred.
- Experience creating APIs and consuming third-party APIs.
- Working knowledge of Angular is preferred.
- Knowledge of stored procedures and experience with a relational database (MS SQL Server 2012 or higher).
- Solid understanding of object-oriented development principles and programming.
- Working knowledge of the web: HTML, CSS, JavaScript, and the Bootstrap framework.
- Ability to create reusable C# libraries.
- Must be able to write clean, well-commented, readable C# code and have the ability to self-learn.
- Working knowledge of Git.
Qualities required:
Over and above technical skills, we prefer:
- Good communication and time-management skills.
- A good team player with the ability to contribute on an individual basis.
- We provide the best learning and growth environment for candidates.
Skills:
.NET Core
.NET Framework
ASP.NET Core
ASP.NET MVC
ASP.NET Web API
C#
HTML
We are a fast-growing digital, cloud, and mobility services provider whose principal market is North America. We are looking for talented database/SQL experts for the management and analytics of large data sets in various enterprise projects.
Responsibilities
Translate business needs to technical specifications
Manage and maintain various database servers (backup, replicas, shards, jobs)
Develop and execute database queries and conduct analyses
Occasionally write scripts for ETL jobs (a brief illustrative script follows this list).
Create tools to store data (e.g. OLAP cubes)
Develop and update technical documentation
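A brief, self-contained illustration of such an ETL script, using the stdlib sqlite3 module so it runs anywhere; the file, table, and column names are made up:

    import csv
    import sqlite3

    conn = sqlite3.connect("warehouse.db")
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")

    # Extract from a flat file, transform amounts to floats, load in bulk.
    with open("sales.csv", newline="") as f:
        rows = [(r["region"], float(r["amount"])) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    conn.commit()

    # The analysis half of the role: aggregate and inspect.
    for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region"
    ):
        print(region, total)
    conn.close()

Against SQL Server, the same shape would use a driver such as pyodbc instead of sqlite3.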
Requirements
Proven experience as a database programmer and administrator
Background in data warehouse design (e.g. dimensional modeling) and data mining
Good understanding of SQL and NoSQL databases, online analytical processing (OLAP), and the ETL (extract, transform, load) framework
Advanced knowledge of SQL queries, SQL Server Reporting Services (SSRS), and SQL Server Integration Services (SSIS)
Familiarity with BI technologies (strong hands-on Tableau experience) is a plus
Analytical mind with a problem-solving aptitude
Job Description
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lakes
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices
- Scripting languages: Python and PySpark (a short Glue job sketch follows this list)
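A sketch of the kind of Glue job these requirements point at, reading Parquet from a hypothetical S3 data lake (paths, the event_id column, and job parameters are assumptions):

    import sys
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from awsglue.utils import getResolvedOptions
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue = GlueContext(SparkContext.getOrCreate())
    job = Job(glue)
    job.init(args["JOB_NAME"], args)

    # Ingest Parquet files landed in the raw zone.
    events = glue.create_dynamic_frame.from_options(
        connection_type="s3",
        connection_options={"paths": ["s3://example-lake/raw/events/"]},
        format="parquet",
    )

    # Drop records without an id before they reach the curated zone.
    valid = events.filter(lambda row: row["event_id"] is not None)

    glue.write_dynamic_frame.from_options(
        frame=valid,
        connection_type="s3",
        connection_options={"path": "s3://example-lake/curated/events/"},
        format="parquet",
    )
    job.commit()

Loading the curated Parquet into Snowflake would then typically happen via COPY INTO from the same S3 stage.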
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS.
- Ingest data from different data sources that expose data through different technologies, such as RDBMS, flat files, streams, and time-series data from various proprietary systems; implement ingestion and processing with the help of Big Data technologies.
- Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the base data platform.
- Develop automated data quality checks to make sure the right data enters the platform and to verify the results of calculations (a brief check sketch follows this list).
- Develop infrastructure to collect, transform, combine, and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights, and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete, and flexible.
- Identify and interpret trends and patterns in complex data sets.
- Construct a framework that uses data visualization tools and techniques to present consolidated, actionable analytical results to relevant stakeholders.
- Participate as a key member in regular Scrum ceremonies with the agile teams.
- Develop queries, write reports, and present findings proficiently.
- Mentor junior members and bring in industry best practices.
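As a small illustration of the data-quality responsibility above, a check that gates publication on row counts and null rates; the path, column, and threshold are hypothetical:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()
    df = spark.read.parquet("s3://example-lake/curated/loans/")

    total = df.count()
    null_ids = df.filter(F.col("loan_id").isNull()).count()

    # Fail the pipeline loudly rather than publish bad data downstream.
    assert total > 0, "no rows landed in the curated zone"
    assert null_ids / total < 0.01, f"too many null loan_ids: {null_ids}/{total}"
    print(f"DQ passed: {total} rows, {null_ids} null ids")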
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales).
- Strong background in math, statistics, computer science, data science, or a related discipline.
- Advanced knowledge of one of the following languages: Java, Scala, Python, C#.
- Production experience with HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, Amazon Web Services (AWS), Docker/Kubernetes, and Snowflake.
- Proficient with:
  - data mining/programming tools (e.g. SAS, SQL, R, Python);
  - database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum);
  - data visualization tools (e.g. Tableau, Looker, MicroStrategy).
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and the ability to present results to non-technical audiences.
- Knowledge of business intelligence and analytical tools, technologies, and techniques.
Familiarity and experience with the following is a plus:
- AWS certification
- Spark Streaming
- Kafka Streams / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
1. Communicate with the clients and understand their business requirements.
2. Build, train, and manage your own team of junior data engineers.
3. Assemble large, complex data sets that meet the client’s business requirements.
4. Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, including the cloud.
6. Assist clients with data-related technical issues and support their data infrastructure requirements.
7. Work with data scientists and analytics experts to deliver greater functionality in the data systems.
Skills required (experience with most of these):
1. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
2. Experience with object-oriented/functional scripting languages: Python, Java, C++, Scala, etc.
3. Experience in ETL and data warehousing.
4. Experience with, and a firm understanding of, relational and non-relational databases such as MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra.
5. Experience with cloud platforms such as AWS, GCP, and Azure.
6. Experience with workflow management using tools like Apache Airflow (a minimal DAG sketch follows).
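To ground the workflow-management point, a minimal Airflow DAG sketch; the task bodies are hypothetical stand-ins for real extract/load logic, and the schedule argument assumes Airflow 2.4+ (older versions use schedule_interval):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull from source systems")

    def load():
        print("load into the warehouse")

    with DAG(
        dag_id="example_etl",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        # Dependencies are declared with >>: extract runs before load.
        extract_task >> load_task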