
15+ Apache Synapse Jobs in India

Apply to 15+ Apache Synapse Jobs on CutShort.io. Find your next job, effortlessly. Browse Apache Synapse Jobs and apply today!

Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
7 - 14 yrs
₹20L - ₹30L / yr
Solution architecture
Data Analytics
Data architecture
Data Warehouse (DWH)
Enterprise Data Warehouse (EDW)
+21 more

As a Solution Architect, you will collaborate with our sales, presales and COE teams to provide technical expertise and support throughout the new business acquisition process. You will play a crucial role in understanding customer requirements, presenting our solutions, and demonstrating the value of our products.


You thrive in high-pressure environments, maintaining a positive outlook and understanding that career growth is a journey that requires strategic choices. You have strong written and verbal communication skills, enabling you to convey complex technical concepts clearly and effectively, and you are a customer-focused, self-motivated team player. You must have experience in managing RFPs/RFIs, running client demos and presentations, and converting opportunities into winning bids. You bring a strong work ethic, a positive attitude, and enthusiasm for new challenges. You can multi-task and prioritize with good time management, work independently with little or no supervision, and follow a process-oriented, methodical, quality-first approach.


The key performance indicator for this role will be the ability to convert clients' business challenges and priorities into winning proposals and bids through excellence in technical solutioning.


What you’ll do

  • Architecture & Design: Develop high-level architecture designs for scalable, secure, and robust solutions.
  • Technology Evaluation: Select appropriate technologies, frameworks, and platforms for business needs.
  • Cloud & Infrastructure: Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP.
  • Integration: Ensure seamless integration between various enterprise applications, APIs, and third-party services.
  • Design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new-generation analytics platforms such as Microsoft Fabric.
  • Translate business needs into technical solutions by designing secure, scalable, and performant data architectures on cloud platforms.
  • Select and recommend appropriate data services (e.g., Microsoft Fabric, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Power BI) to meet specific data storage, processing, and analytics needs.
  • Develop and recommend data models that optimize data access and querying. Design and implement data pipelines for efficient data extraction, transformation, and loading (ETL/ELT) processes.
  • Strong understanding of conceptual, logical, and physical data modelling.
  • Choose and implement appropriate data storage, processing, and analytics services based on specific data needs (e.g., data lakes, data warehouses, data pipelines).
  • Understand and recommend data governance practices, including data lineage tracking, access control, and data quality monitoring.
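The data-modelling responsibilities above can be made concrete with a minimal sketch. This is illustrative only: SQLite stands in for a cloud warehouse such as Synapse, and every table and column name is invented for the example.

```python
import sqlite3

# In-memory SQLite stands in for a cloud warehouse purely for illustration.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: descriptive attributes, one row per product.
cur.execute("""
CREATE TABLE dim_product (
    product_id   INTEGER PRIMARY KEY,
    product_name TEXT NOT NULL,
    category     TEXT NOT NULL
)""")

# Fact table: measures keyed by dimension foreign keys.
cur.execute("""
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    product_id INTEGER REFERENCES dim_product(product_id),
    sale_date  TEXT NOT NULL,
    amount     REAL NOT NULL
)""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)", [
    (1, "Widget", "Hardware"),
    (2, "Gadget", "Hardware"),
    (3, "Service Plan", "Services"),
])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)", [
    (1, 1, "2024-01-05", 100.0),
    (2, 2, "2024-01-06", 250.0),
    (3, 3, "2024-02-01", 75.0),
])

# A typical analytical query this model optimizes: revenue by category.
rows = cur.execute("""
    SELECT p.category, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.category
    ORDER BY revenue DESC
""").fetchall()
print(rows)  # → [('Hardware', 350.0), ('Services', 75.0)]
```

The same star-schema shape (narrow fact tables joined to wide dimensions) is what the Synapse/Fabric services listed above are designed to query at scale.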



What you will bring

  • 10+ years of working in data analytics and AI technologies from consulting, implementation and design perspectives
  • Certifications in data engineering, analytics, cloud or AI will be a distinct advantage
  • A Bachelor's degree in engineering/technology or an MCA from a reputed college is a must
  • Prior experience working as a solution architect during the presales cycle will be an advantage


Soft Skills

  • Communication Skills
  • Presentation Skills
  • Flexible and Hard-working


Technical Skills

  • Knowledge of Presales Processes
  • Basic understanding of business analytics and AI
  • High IQ and EQ


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.
Cloudesign Technology Solutions
Remote only
5 - 13 yrs
₹15L - ₹28L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+8 more

Job Description: Data Engineer

Location: Remote

Experience Required: 6 to 12 years in Data Engineering

Employment Type: Full-time

Notice period: Immediate joiners preferred, 15 days maximum

 

About the Role:

We are looking for a highly skilled Data Engineer with extensive experience in Python, Databricks, and Azure services. The ideal candidate will have a strong background in building and optimizing ETL processes, managing large-scale data infrastructures, and implementing data transformation and modeling tasks.

 

Key Responsibilities:

ETL Development:

Use Python as an ETL tool to read data from various sources, perform data type transformations, handle errors, implement logging mechanisms, and load data into Databricks-managed delta tables.

Develop robust data pipelines to support analytics and reporting needs.
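As a rough sketch of that ETL flow: the CSV source, the column names, and the SQLite sink (standing in for a Databricks-managed Delta table) are all assumptions for illustration, but the shape — read, cast types, handle errors, log, load — matches the responsibility described.

```python
import csv
import io
import logging
import sqlite3

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("etl")

# Illustrative source data; in practice this would come from files or APIs.
raw = io.StringIO("order_id,amount\n1,10.50\n2,not-a-number\n3,7.25\n")

# SQLite stands in for a Databricks-managed Delta table in this sketch.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")

loaded, rejected = 0, 0
for row in csv.DictReader(raw):
    try:
        # Data-type transformation: cast strings to typed columns.
        record = (int(row["order_id"]), float(row["amount"]))
    except (KeyError, ValueError) as exc:
        # Error handling + logging: skip bad rows, record why.
        log.warning("rejected row %r: %s", row, exc)
        rejected += 1
        continue
    conn.execute("INSERT INTO orders VALUES (?, ?)", record)
    loaded += 1

conn.commit()
log.info("loaded=%d rejected=%d", loaded, rejected)
```

In a real pipeline the rejected rows would typically land in a quarantine table rather than only a log line, so reconciliation is auditable.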

Data Transformation & Optimization:

Perform data transformations and evaluations within Databricks.

Work on optimizing data workflows for performance and scalability.

Azure Expertise:

Implement and manage Azure services, including Azure SQL Database, Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake.

Coding & Development:

Utilize Python for complex tasks involving classes, objects, methods, dictionaries, loops, packages, wheel files, and database connectivity.

Write scalable and maintainable code to manage streaming and batch data processing.

Cloud & Infrastructure Management:

Leverage Spark, Scala, and cloud-based solutions to design and maintain large-scale data infrastructures.

Work with cloud data warehouses, data lakes, and storage formats.

Project Leadership:

Lead data engineering projects and collaborate with cross-functional teams to deliver solutions on time.


Required Skills & Qualifications:

Technical Proficiency:

  • Expertise in Python for ETL and data pipeline development.
  • Strong experience with Databricks and Apache Spark.
  • Proven skills in handling Azure services, including Azure SQL Database, Azure Data Factory, Azure Synapse Analytics, and Azure Data Lake.


Experience & Knowledge:

  • At least 6 years of experience in data engineering.
  • Solid understanding of data modeling, ETL processes, and optimizing data pipelines.
  • Familiarity with Unix shell scripting and scheduling tools.

Other Skills:

  • Knowledge of cloud warehouses and storage formats.
  • Experience in handling large-scale data infrastructures and streaming data.

 

Preferred Qualifications:

  • Proven experience with Spark and Scala for big data processing.
  • Prior experience in leading or mentoring data engineering teams.
  • Hands-on experience with end-to-end project lifecycle in data engineering.

 

What We Offer:

  • Opportunity to work on challenging and impactful data projects.
  • A collaborative and innovative work environment.
  • Competitive compensation and benefits.

 

How to Apply:

https://cloudesign.keka.com/careers/jobdetails/73555

Cornertree
Posted by Deepesh Shrimal
Gurugram
4 - 9 yrs
₹7L - ₹35L / yr
PySpark
Data engineering
Big Data
Hadoop
Spark
+5 more

This requirement is for a Data Engineer in Gurugram for a Data Analytics project.

  • Building ETL/ELT pipelines of data from various sources using SQL/Python/Spark
  • Ensuring that data are modelled and processed according to architecture and requirements, both functional and non-functional
  • Understanding and implementing required development guidelines, design standards and best practices
  • Delivering the right solution architecture, automation and technology choices
  • Working cross-functionally with enterprise architects, information security teams and platform teams
  • Suggesting and implementing architecture improvements
  • Experience with programming languages such as Python or Scala
  • Knowledge of Data Warehouse, Business Intelligence and ETL/ELT data processing issues
  • Ability to create and orchestrate ETL/ELT processes in different tools (ADF, Databricks Workflows)
  • Experience working with the Databricks platform: workspace, Delta Lake, workflows, jobs, Unity Catalog
  • Understanding of SQL and relational databases
  • Practical knowledge of various relational and non-relational database engines in the cloud (Azure SQL Database, Azure Cosmos DB, Microsoft Fabric, Databricks)
  • Hands-on experience with data services offered by the Azure cloud
  • Knowledge of Apache Spark (Databricks, Azure Synapse Spark Pools)
  • Experience in performing code review of ETL/ELT pipelines and SQL queries
  • Analytical approach to problem solving

Amazech Systems pvt Ltd
Remote only
5 - 7 yrs
₹8L - ₹13L / yr
ADF
Apache Synapse
SSIS
SQL
ETL
+11 more

Hiring for Azure Data Engineers.

Location: Bangalore

Employment type: Full-time, permanent

website: www.amazech.com

 

Qualifications: 

B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, or Electrical/Electronics Engineering, with a good academic background.


Experience and Required Skill Sets:


•       Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob and Azure Storage Explorer

•       Experience in data warehouse/analytical systems using Azure Synapse

•       Proficient in creating Azure Data Factory pipelines for ETL processing: copy activity, custom Azure development, Synapse, etc.

•       Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, and Purview

•       Good technical knowledge of the Microsoft SQL Server BI suite (ETL, Reporting, Analytics, Dashboards) using SSIS, SSAS, SSRS and Power BI

•       Design and develop batch and real-time streaming data loads to data warehouse systems

 

 Other Requirements:


A Bachelor's or Master's degree (Engineering or computer-related degree preferred)

Strong understanding of Software Development Life Cycles including Agile/Scrum


Responsibilities: 

•       Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.

•       Responsibility for the bottom line, strong project management abilities, and the ability to keep the team to timelines.

[x]cube LABS
Posted by Krishna kandregula
Hyderabad
2 - 6 yrs
₹8L - ₹20L / yr
ETL
Informatica
Data Warehouse (DWH)
PowerBI
DAX
+12 more
  • Creating and managing ETL/ELT pipelines based on requirements
  • Building Power BI dashboards and managing the required datasets
  • Working with stakeholders to identify data structures needed in the future and performing any transformations, including aggregations
  • Building data cubes for real-time visualisation needs and CXO dashboards
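Building a "data cube" amounts to pre-aggregating measures over every combination of dimensions so dashboards can read totals without rescanning raw data. A minimal pure-Python sketch, with invented region/month sample data standing in for real warehouse records:

```python
from collections import defaultdict
from itertools import combinations

# Invented sample records: (region, month, revenue).
records = [
    ("North", "Jan", 100), ("North", "Feb", 150),
    ("South", "Jan", 200), ("South", "Feb", 50),
]

# Pre-aggregate every subset of the dimensions (a tiny 2-D "cube"):
# the empty subset is the grand total, single dimensions are subtotals,
# and the full pair is the most detailed cell.
dims = ("region", "month")
cube = defaultdict(float)
for region, month, revenue in records:
    values = {"region": region, "month": month}
    for r in range(len(dims) + 1):
        for subset in combinations(dims, r):
            key = tuple((d, values[d]) for d in subset)
            cube[key] += revenue

print(cube[()])                      # grand total → 500.0
print(cube[(("region", "North"),)])  # North subtotal → 250.0
```

Real cube engines (SSAS, Synapse, Power BI aggregations) do this at scale with storage-efficient encodings, but the rollup idea is the same.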


Required Tech Skills


  • Microsoft PowerBI & DAX
  • Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark
  • Azure Synapse, Azure Databricks, Azure HDInsight, Azure Data Factory



We deliver business outcomes by stitching technology, experi

Agency job
via Startup Login by Ambika Marathe
Hyderabad
5 - 10 yrs
₹19L - ₹25L / yr
ETL
Informatica
Data Warehouse (DWH)
Windows Azure
Microsoft Windows Azure
+4 more

A business transformation organization that partners with businesses to co-create customer-centric, hyper-personalized solutions to achieve exponential growth. Invente offers platforms and services that enable businesses to provide human-free customer experience and business process automation.


Location: Hyderabad (WFO)

Budget: Open

Position: Azure Data Engineer

Experience: 5+ years of commercial experience


Responsibilities

●     Design and implement Azure data solutions using ADLS Gen 2.0, Azure Data Factory, Synapse, Databricks, SQL, and Power BI

●     Build and maintain data pipelines and ETL processes to ensure efficient data ingestion and processing

●     Develop and manage data warehouses and data lakes

●     Ensure data quality, integrity, and security

●     Implement existing use cases required by the AI and analytics teams

●     Collaborate with other teams to integrate data solutions with other systems and applications

●     Stay up-to-date with emerging data technologies and recommend new solutions to improve our data infrastructure


Exponentia.ai
Posted by Vipul Tiwari
Mumbai
7 - 10 yrs
₹13L - ₹19L / yr
Project Management
IT project management
Software project management
Business Intelligence (BI)
Data Warehouse (DWH)
+8 more

Role: Project Manager

Experience: 8-10 Years

Location: Mumbai


Company Profile:



Exponentia.ai is an AI tech organization with a presence across India, Singapore, the Middle East, and the UK. We are an innovative and disruptive organization, working on cutting-edge technology to help our clients transform into the enterprises of the future. We provide artificial intelligence-based products/platforms capable of automated cognitive decision-making to improve productivity, quality, and economics of the underlying business processes. Currently, we are rapidly expanding across machine learning, Data Engineering and Analytics functions. Exponentia.ai has developed long-term relationships with world-class clients such as PayPal, PayU, SBI Group, HDFC Life, Kotak Securities, Wockhardt and Adani Group amongst others.

One of the top partners of Databricks, Azure, Cloudera (a leading analytics player) and Qlik (a leader in BI technologies), Exponentia.ai has recently been awarded the ‘Innovation Partner Award’ by Qlik and the "Excellence in Business Process Automation Award" (IMEA) by Automation Anywhere.

Get to know more about us at http://www.exponentia.ai and https://in.linkedin.com/company/exponentiaai 


Role Overview:


·        The Project Manager will be responsible for overseeing the successful delivery of a range of projects in Business Intelligence, data warehousing and analytics/AI-ML.

·        The Project Manager is expected to manage the project and lead teams of BI engineers, data engineers, data scientists and application developers.


Job Responsibilities:


·        Effort estimation, creating a project plan, planning milestones and activities, and tracking progress.

·        Identify risks and issues. Come up with a mitigation plan.

·        Status reporting to both internal and external stakeholders.

·        Communicate with all stakeholders.

·        Manage end-to-end project lifecycle - requirements gathering, design, development, testing and go-live.

·        Manage end-to-end BI or data warehouse projects.

·        Must have experience in running Agile-based project development.


Technical skills


·        Experience in Business Intelligence, data warehousing or analytics projects.

·        Understand data lake and data warehouse solutions including ETL pipelines.

·        Good to have - Knowledge of Azure Blob Storage, Azure Data Factory and Synapse Analytics.

·        Good to have - Knowledge of Qlik Sense or Power BI

·        Good to have - Certified in PMP/PRINCE2/Agile project management.

·        Excellent written and verbal communication skills.


 Education:

MBA, B.E. or B. Tech. or MCA degree

Consulting & Implementation Services Data Analytic & EPM

Agency job
via Merito by Sana Patel
Noida
5 - 10 yrs
₹15L - ₹30L / yr
Windows Azure
IT infrastructure
Apache Synapse
Azure Data Factory
Data Lake
Hi,

About the company:

Our Client enables enterprises in their digital transformation journey by offering Consulting & Implementation Services related to Data Analytics &Enterprise Performance Management (EPM).

Job Location : Noida

Position: Azure Solution Architect

Notice period: Immediate to 60 days

Experience: 6+ years

Job Description:

Your Role and Responsibilities

  • Able to drive technology design meetings and propose technology design and architecture
  • Experienced in the design and delivery of enterprise-level highly available solutions
  • Work closely with project management teams to successfully monitor implementation progress
  • Collaborate with the pre-sales team on RFPs
  • Provide detailed specifications for proposed solutions
  • Experienced in migrating applications to the cloud
  • Experienced in creating data lake and data warehouse solutions
  • Ability to implement the solution as per technical requirements
  • Identity, authentication, security, privacy and compliance, including Active Directory; modern application architecture (queues, micro-services, containers, etc.)

Required Technical and Professional Expertise

  • Project management and leadership skills are essential.
  • 4+ years of experience developing IT and cloud infrastructure (MS Azure, GCP).
  • Working knowledge of the MS Azure technology stack and related technology (e.g. Data Factory, Data Flow, Synapse Analytics, Synapse ML, Gen2 storage, etc.).
  • Master's degree in Computer Science or Software Engineering preferred.
  • Current understanding of best practices regarding system security measures.
  • Experience in software engineering and design architecture.
  • Positive outlook in meeting challenges and working to a high level.
  • Advanced understanding of business analysis techniques and processes.
  • Good to have Azure ML experience

Preferred Technical and Professional Experience

  • MS Azure Certification: Fundamentals, Solution Architect, Data Engineer
Regards
Team Merito
GradMener Technology Pvt. Ltd.
Pune, Chennai
5 - 9 yrs
₹15L - ₹20L / yr
Scala
PySpark
Spark
SQL Azure
Hadoop
+4 more
  • 5+ years of experience in a Data Engineering role on cloud environment
  • Must have good experience in Scala/PySpark (preferably in a Databricks environment)
  • Extensive experience with Transact-SQL.
  • Experience in Databricks/Spark.
  • Strong experience in data warehouse projects
  • Expertise in database development projects with ETL processes.
  • Manage and maintain data engineering pipelines
  • Develop batch processing, streaming and integration solutions
  • Experienced in building and operationalizing large-scale enterprise data solutions and applications
  • Using one or more of Azure data and analytics services in combination with custom solutions
  • Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products or equivalent products from other cloud services providers
  • In-depth understanding of data management (e.g. permissions, security, and monitoring).
  • Cloud repositories, e.g. Azure GitHub, Git
  • Experience in an agile environment (Prefer Azure DevOps).

Good to have

  • Manage source data access security
  • Automate Azure Data Factory pipelines
  • Continuous Integration/Continuous deployment (CICD) pipelines, Source Repositories
  • Experience in implementing and maintaining CICD pipelines
  • Power BI understanding, Delta Lake house architecture
  • Knowledge of software development best practices.
  • Excellent analytical and organization skills.
  • Effective working in a team as well as working independently.
  • Strong written and verbal communication skills.
  • Expertise in database development projects and ETL processes.
a global business process management company

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
3 - 8 yrs
₹14L - ₹20L / yr
Business Intelligence (BI)
PowerBI
Windows Azure
Git
SVN
+9 more

Power BI Developer (Azure Developer)

Job Description:

A senior visualization engineer with an understanding of Azure Data Factory and Databricks, who will develop and deliver solutions that enable the delivery of information to audiences in support of key business processes.

Ensure code and design quality through execution of test plans and assist in development of standards & guidelines working closely with internal and external design, business and technical counterparts.

 

Desired Competencies:

  • Strong data visualization design concepts centred on the business user, and a knack for communicating insights visually.
  • Ability to produce any of the available charting methods with drill-down options and action-based reporting, including use of the right graphs for the underlying data with company themes and objects.
  • Publishing reports & dashboards on reporting server and providing role-based access to users.
  • Ability to create wireframes on any tool for communicating the reporting design.
  • Creation of ad-hoc reports & dashboards to visually communicate data hub metrics (metadata information) for top management understanding.
  • Should be able to handle huge volume of data from databases such as SQL Server, Synapse, Delta Lake or flat files and create high performance dashboards.
  • Should be good at Power BI development
  • Expertise in 2 or more BI (Visualization) tools in building reports and dashboards.
  • Understanding of Azure components like Azure Data Factory, Data lake Store, SQL Database, Azure Databricks
  • Strong knowledge in SQL queries
  • Must have worked in full life-cycle development from functional design to deployment
  • Intermediate ability to format, process and transform data
  • Should have working knowledge of GIT, SVN
  • Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.
  • Basic understanding of data modelling and ability to combine data from multiple sources to create integrated reports

 

Preferred Qualifications:

  • Bachelor's degree in Computer Science or Technology
  • Proven success in contributing to a team-oriented environment
Abu Dhabi, Dubai
6 - 12 yrs
₹18L - ₹25L / yr
PySpark
Big Data
Spark
Data Warehouse (DWH)
SQL
+2 more
Must-Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience in SQL databases - able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformation and aggregations
• Good at ELT architecture: business-rules processing and data extraction from the data lake into data streams for business consumption
• Good customer communication skills
• Good analytical skills
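A SQL query "of fair complexity" in the sense above typically combines joins, grouping, and a filter on an aggregate. A small illustrative example (the schema and data are invented; SQLite is used here only as a convenient engine):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex'), (3, 'Initech');
INSERT INTO orders VALUES
    (1, 1, 120.0), (2, 1, 80.0), (3, 2, 40.0), (4, 3, 300.0);
""")

# Customers whose total spend exceeds 100, highest first:
# a join, an aggregate, and a HAVING filter on the aggregate.
rows = conn.execute("""
    SELECT c.name, SUM(o.amount) AS total
    FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id, c.name
    HAVING SUM(o.amount) > 100
    ORDER BY total DESC
""").fetchall()
print(rows)  # → [('Initech', 300.0), ('Acme', 200.0)]
```

The same query shape translates directly to Spark SQL over DataFrames registered as temp views.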
 
 
Technology Skills (Good to Have):
  • Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
  • Experience in migrating on-premise data warehouses to data platforms on AZURE cloud. 
  • Designing and implementing data engineering, ingestion, and transformation functions
  • Azure Synapse or Azure SQL Data Warehouse
  • Spark on Azure (available in HDInsight and Databricks)
Blenheim Chalcot IT Services India Pvt Ltd

Agency job
Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure data factory
Azure Datalake
Azure Databricks
+13 more
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade
Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types
including text, video and audio through to live stream and IoT in an agile project delivery
environment with a focus on DataOps and Data Observability. You will work with Azure SQL
Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure
Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data
Catalogue and Purview among other tools, gaining opportunities to learn some of the most
advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with
fellow architects, engineers, analysts, and team leads and project managers to plan, build and roll
out data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise of data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX)
across a wide range of electronic delivery mechanisms (API/SFTP, etc.)
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/object function scripting languages such as Python, Java, JavaScript, C#,
Scala, etc is required.
Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
Essential Experience:
5 or more years of hands-on experience in a data architect role with the development of ingestion,
integration, data auditing, reporting, and testing with Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data
solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its
resources to implement solutions is a must.
Experience working in the public sector or in an organisation servicing the public sector is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or google cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences - in writing, through presentations and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with a multi-project, multi-tasking consulting Data Architect lifestyle.
Excellent interpersonal skills with teams and building trust with clients
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
Indium Software
Posted by Mohammed Shabeer
Remote only
2 - 3 yrs
₹5L - ₹8L / yr
Data Analytics
data analyst
Apache Synapse
SQL
SAP MDG (Master Data Governance)
+1 more
The Data Analyst in the CoE will provide end-to-end solution development, working in conjunction with the Domain Leads and Technology Partners. They are responsible for the delivery of solutions and solution changes driven by business requirements, as well as for providing technical and development capabilities.
Fragma Data Systems
Posted by Evelyn Charles
Remote, Bengaluru (Bangalore), Hyderabad, Chennai, Mumbai, Pune
8 - 15 yrs
₹16L - ₹28L / yr
PySpark
SQL Azure
Azure Synapse
Windows Azure
Azure Data Engineer
+3 more
Technology Skills:
  • Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
  • Experience in migrating on-premise data warehouses to data platforms on AZURE cloud. 
  • Designing and implementing data engineering, ingestion, and transformation functions
Good to Have: 
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/StreamSets, Informatica
  • Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
  • Capacity Planning and Performance Tuning on Azure Stack and Spark.
Fragma Data Systems
Posted by Priyanka U
Bengaluru (Bangalore)
8 - 10 yrs
₹16L - ₹28L / yr
SQL Azure
Azure Synapse
Azure
Azure Data Architect
Spark
+4 more
Technology Skills:
 
  • Building and operationalizing large scale enterprise data solutions and applications using one or more of AZURE data and analytics services in combination with custom solutions - Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsights, Databricks, CosmosDB, EventHub/IOTHub.
  • Experience in migrating on-premise data warehouses to data platforms on AZURE cloud. 
  • Designing and implementing data engineering, ingestion, and transformation functions
 
 
Good to Have: 
  • Experience with Azure Analysis Services
  • Experience in Power BI
  • Experience with third-party solutions like Attunity/StreamSets, Informatica
  • Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
  • Capacity Planning and Performance Tuning on Azure Stack and Spark.