2+ Hadoop Jobs in Ahmedabad | Hadoop Job openings in Ahmedabad


Data Analytics Lead
Responsibilities:
· Oversee the design, development, and implementation of data analysis solutions to meet business needs.
· Work closely with business stakeholders and the Aviation SME to define data requirements, project scope, and deliverables.
· Drive the design and development of analytics data models and data warehouse designs.
· Develop and maintain data quality standards and procedures.
· Manage and prioritize data analysis projects, ensuring timely completion.
· Identify opportunities to improve data analysis processes and tools.
· Collaborate with Data Engineers and Data Architects to ensure data solutions align with the overall data platform architecture.
· Evaluate and recommend new data analysis tools and technologies.
· Contribute to the development of best practices for data analysis.
· Participate in project meetings and provide input on data-related issues, risks, and requirements.
Qualifications:
· 8+ years of experience in data analytics, including experience leading or mentoring a team.
· Extensive experience with cloud-based data modelling and data warehousing solutions using Azure Databricks.
· Proven experience with data technologies and platforms and with ETL processes and tools, preferably Azure Data Factory, Azure Databricks (Spark), and Delta Lake.
· Advanced proficiency in data visualization tools such as Power BI.
Data Analysis and Visualization:
- Experience in data analysis, statistical modelling, and machine learning techniques.
- Proficiency in analytical tools like Python, R, and libraries such as Pandas, NumPy for data analysis and modelling.
- Strong expertise in Power BI, Superset, and Tableau for data visualization and data modelling, including DAX queries, with knowledge of best practices.
- Experience in implementing Row-Level Security in Power BI.
- Ability to work with moderately complex data models and quickly understand application data design and processes.
- Familiar with industry best practices for Power BI and experienced in performance optimization of existing implementations.
- Understanding of machine learning algorithms, including supervised, unsupervised, and deep learning techniques.
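As a minimal sketch of the statistical modelling skills listed above, the following fits a simple linear model by ordinary least squares using only the Python standard library. The sample data is hypothetical.

```python
# Fit y = slope * x + intercept by ordinary least squares.
# Hypothetical sample data; stdlib only, no Pandas/NumPy required.

def fit_line(xs, ys):
    """Return (slope, intercept) minimising the squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

xs = [1, 2, 3, 4, 5]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]
slope, intercept = fit_line(xs, ys)
```

In practice the same fit would usually be done with NumPy or a statistics library, but the arithmetic is the same.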
Data Handling and Processing:
- Proficient in SQL Server and query optimization.
- Expertise in application data design and process management.
- Extensive knowledge of data modelling.
- Hands-on experience with Azure Data Factory and Azure Databricks.
- Expertise in data warehouse development, including experience with SSIS (SQL Server Integration Services) and SSAS (SQL Server Analysis Services).
- Proficiency in ETL processes (data extraction, transformation, and loading), including data cleaning and normalization.
- Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka) for large-scale data processing.
- Understanding of data governance, compliance, and security measures within Azure environments.
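The ETL steps listed above (extraction, transformation with cleaning and normalization, loading) can be sketched end to end with the standard library alone. The CSV data, column names, and table name below are hypothetical.

```python
# Minimal ETL sketch: extract rows from a CSV source, clean and
# normalise them, then load them into an in-memory SQLite table.
import csv
import io
import sqlite3

raw = """flight,delay_minutes
AI101, 12
AI102,
ai103,7
"""

# Extract: parse the raw CSV into dictionaries.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows with missing delays, normalise flight codes.
clean = [
    {"flight": r["flight"].strip().upper(),
     "delay_minutes": int(r["delay_minutes"])}
    for r in rows
    if r["delay_minutes"].strip()
]

# Load: insert the cleaned rows and query them back.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE delays (flight TEXT, delay_minutes INTEGER)")
conn.executemany(
    "INSERT INTO delays VALUES (:flight, :delay_minutes)", clean
)
avg = conn.execute("SELECT AVG(delay_minutes) FROM delays").fetchone()[0]
```

Production pipelines would run these stages in a tool such as Azure Data Factory or Spark, but the extract-transform-load shape is the same.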

Responsibilities:
1. Communicate with clients and understand their business requirements.
2. Build, train, and manage your own team of junior data engineers.
3. Assemble large, complex data sets that meet the client’s business requirements.
4. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
5. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, including the cloud.
6. Assist clients with data-related technical issues and support their data infrastructure requirements.
7. Work with data scientists and analytics experts to strive for greater functionality.
Skills required (experience with most of the following):
1. Experience with Big Data tools: Hadoop, Spark, Apache Beam, Kafka, etc.
2. Experience with object-oriented and functional scripting languages: Python, Java, C++, Scala, etc.
3. Experience in ETL and Data Warehousing.
4. Experience with and a firm understanding of relational and non-relational databases such as MySQL, MS SQL Server, Postgres, MongoDB, and Cassandra.
5. Experience with cloud platforms like AWS, GCP and Azure.
6. Experience with workflow management using tools like Apache Airflow.
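The idea behind workflow managers such as Apache Airflow is that tasks form a directed acyclic graph (DAG), and each task runs only after its upstream dependencies finish. A pure-Python sketch of that scheduling order, with hypothetical task names:

```python
# Resolve a dependency-ordered task schedule with the stdlib graphlib
# module (Python 3.9+). Task names are hypothetical placeholders.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "load": {"clean"},
    "report": {"load"},
}

# static_order() yields tasks so that dependencies always come first.
order = list(TopologicalSorter(dag).static_order())
```

Airflow adds scheduling, retries, and monitoring on top of this, but the dependency resolution it performs is topological ordering of exactly this kind.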