
Job Description:
We are seeking an experienced Azure Data Engineer with expertise in Azure Data Factory, Azure Databricks, and Azure Data Fabric to lead the migration of our existing data pipeline and processing infrastructure. The ideal candidate will have a strong background in Azure cloud data services, big data analytics, and data engineering, with specific experience in Azure Data Fabric. We are looking for someone who has at least 6 months of hands-on experience with Azure Data Fabric or has successfully completed at least one migration to Azure Data Fabric.
Key Responsibilities:
- Assess the current data architecture using Azure Data Factory and Databricks and develop a detailed migration plan to Azure Data Fabric.
- Design and implement end-to-end data pipelines within Azure Data Fabric, including data ingestion, transformation, storage, and analytics.
- Optimize data workflows to leverage Azure Data Fabric's unified platform for data integration, big data processing, and real-time analytics.
- Ensure seamless integration of data from SharePoint and other sources into Azure Data Fabric, maintaining data quality and integrity.
- Collaborate with business analysts and business stakeholders to align data strategies and optimize the data environment for machine learning and AI workloads.
- Implement security best practices, including data governance, access control, and monitoring within Azure Data Fabric.
- Conduct performance tuning and optimization for data storage and processing within Azure Data Fabric to ensure high availability and cost efficiency.
Key Requirements:
- Proven experience (5+ years) in Azure data engineering with a strong focus on Azure Data Factory and Azure Databricks.
- At least 6 months of hands-on experience with Azure Data Fabric or completion of one migration to Azure Data Fabric.
- Hands-on experience in designing, building, and managing data pipelines, data lakes, and data warehouses on Azure.
- Expertise in Spark, SQL, and data transformation techniques within Azure environments.
- Strong understanding of data governance, security, and compliance in cloud environments.
- Experience with migrating data architectures and optimizing workflows on cloud platforms.
- Ability to work collaboratively with cross-functional teams and communicate technical concepts effectively to non-technical stakeholders.
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect Expert) are a plus.
Additional Requirements:
- A minimum of 6 months of work experience in Data Fabric is a hard requirement.
- Solid technical skills in Databricks, Data Fabric, and Data Factory.
- Polished communication and interpersonal skills.
- At least 6 years of experience with Databricks and Data Factory.

What You’ll Do:
As a Sr. Data Scientist, you will work closely with DeepIntent Data Science teams located in New York, India, and Bosnia. The role focuses on building predictive models and implementing data-driven solutions to maximize ad effectiveness. You will also lead efforts to generate analyses and insights related to the measurement of campaign outcomes, Rx, and the patient journey, and support the evolution of the DeepIntent product suite. Activities in this position include developing and deploying models in production; reading campaign results; analyzing medical claims, clinical, demographic, and clickstream data; performing analyses and creating actionable insights; and summarizing and presenting results and recommended actions to internal stakeholders and external clients as needed.
- Explore ways to create better predictive models.
- Analyze medical claims, clinical, demographic and clickstream data to produce and present actionable insights.
- Explore ways of using inference, statistical, and machine learning techniques to improve the performance of existing algorithms and decision heuristics.
- Design and deploy new iterations of production-level code.
- Contribute posts to our upcoming technical blog.
Who You Are:
- Bachelor’s degree in a STEM field, such as Statistics, Mathematics, Engineering, Biostatistics, Econometrics, Economics, Finance, or Data Science.
- 5+ years of working experience as a Data Scientist or Researcher in digital marketing, consumer advertisement, telecom, or other areas requiring customer-level predictive analytics.
- Advanced proficiency in performing statistical analysis in Python, including relevant libraries, is required.
- Experience with data processing and transformation, and with building model pipelines using tools such as Spark, Airflow, and Docker.
- You have an understanding of the ad-tech ecosystem, digital marketing and advertising data and campaigns or familiarity with the US healthcare patient and provider systems (e.g. medical claims, medications).
- You have varied and hands-on predictive machine learning experience (deep learning, boosting algorithms, inference, etc.).
- You are interested in translating complex quantitative results into meaningful findings and interpretable deliverables, and communicating with less technical audiences orally and in writing.
- You can write production-level code and work with Git repositories.
- Active Kaggle participant.
- Working experience with SQL.
- Familiarity with medical and healthcare data (medical claims, Rx) preferred.
- Conversant with cloud technologies such as AWS or Google Cloud.
Job Description:
Technical Lead – Full Stack
Experience: 8–12 years (strong candidates: 50% Java, 50% React)
Location – Bangalore/Hyderabad
Interview Levels – 3 Rounds
Tech Stack: Java, Spring Boot, Microservices, React, SQL
Focus: Hands-on coding, solution design, team leadership, delivery ownership
Must-Have Skills (Depth)
Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.
Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.
Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.
React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).
SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.
Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.
DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
Job Title: Lead Java Developer
Experience Required: 7+ Years
Location: Mumbai
About the Role
We are seeking a highly skilled Senior Backend Developer with deep expertise in building scalable, high-performance applications. The ideal candidate will have strong hands-on experience with Java, Spring Boot, Microservices, and distributed systems. You will play a key role in designing and optimizing APIs, architecting robust systems, and collaborating with cross-functional teams to deliver innovative solutions.
Key Responsibilities
- Design, develop, and maintain scalable backend systems using Java, Spring Boot, and Microservices architecture.
- Build and optimize APIs for large-scale applications ensuring high performance and reliability.
- Apply clean coding principles, SOLID design patterns, and clean architecture to deliver maintainable and robust code.
- Work with distributed systems technologies (Kafka, ELK, in-memory databases, Cassandra, or similar).
- Write efficient SQL queries and integrate with relational and NoSQL databases.
- Collaborate with product managers, architects, and other developers to define and align on technical decisions.
- Identify and solve complex problems with innovative and scalable solutions.
- Drive best practices in coding standards, system design, and performance optimization.
- Communicate complex technical concepts clearly across teams and stakeholders.
Skills & Qualifications
- 7+ years of experience as a Backend Developer.
- Strong expertise in Java, Spring Boot, Microservices, SQL.
- Significant experience in API design and optimization for enterprise-scale applications.
- Proficiency in distributed systems & related technologies (Kafka, ELK, in-memory DBs, Cassandra, etc.).
- Strong understanding of object-oriented engineering principles (SOLID, clean architecture).
- Excellent problem-solving and analytical skills with the ability to simplify complex concepts.
- Strong communication and collaboration skills, with the ability to influence and align teams.
What you will do:
• Write articles, blog posts, and web content
• Edit and proofread content for clarity and grammar
• Research topics to produce fresh, accurate content
What you will need:
• Strong writing skills
• Good grammar and attention to detail
• Ability to meet deadlines
Experience: 1 Year
Location: Rajkot, Gujarat
Position Type: Full-time, in-office role | No remote work available
KeyLogic Infotech Private Limited offers our customers a wealth of technical and business expertise. We build complex, diverse web and mobile solutions for any business need.
Our knowledge and experience translate to added value and peace of mind for our customers. With KeyLogic Infotech you get quality software and perfect service every time. We are seeking a skilled PHP Laravel Developer to join our team and work on client-based projects.
Overview:
As a Laravel Developer, you will be responsible for developing high-quality web applications using PHP, Laravel framework, MySQL, HTML, CSS, and jQuery. You will collaborate with a team of developers and project managers to deliver robust, scalable, and user-friendly solutions.
CodeIgniter developers, seize the opportunity to transition to Laravel now! Benefit from our comprehensive training, ensuring a smooth switch, and enjoy a salary increment that recognizes your valuable CodeIgniter expertise.
We are also accepting applications for this profile from PHP Developers, Laravel Developers, Web Developers, and PHP Laravel Developers.
Key Responsibilities:
- Addressing the client's and the development team's goals for the project.
- Designing and building web applications using Laravel.
- Identifying and fixing issues in implementation and debug builds.
- Collaborating with front-end and back-end developers on projects.
- Testing backend and user-facing functionality.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience with unit testing, REST APIs, and PHP is desirable.
- A thorough knowledge of Laravel application design.
- Expertise in JavaScript and HTML.
- Problem-solving abilities and a critical perspective.
- Excellent interpersonal skills.
- The capacity and willingness to learn.
REQUIREMENTS
Core skills:
● Technical Experience (must have): working knowledge of any visualization tool (Metabase, Tableau, QlikSense, Looker, Superset, Power BI, etc.), strong SQL and Python, and Excel/Google Sheets.
● Product Knowledge (must have): knowledge of Google Analytics, BigQuery, or Mixpanel; must have worked on A/B testing and event tracking. Must be familiar with product (app, website) data and have good product sense.
● Analytical Thinking: outstanding analytical and problem-solving skills, with the ability to break down a problem statement during execution.
Core Experience:
● Overall experience of 2–5 years in the analytics domain.
● Hands-on experience in the analytics domain: building data-story dashboards, performing root-cause analysis (RCA), and analyzing data.
● Understanding of, and hands-on experience with, product analytics (funnels, A/B experiments, etc.).
● Ability to define the right metric for a specific product feature or experiment and perform impact analysis.
● Ability to explain complex data insights to a wider audience and recommend next steps.
● Experience analyzing, exploring, and mining large data sets to support reporting and ad-hoc analysis.
● Strong attention to detail and accuracy of output.
AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must-Have Skills: AWS Glue, DMS, SQL, Python, PySpark, data integration, and DataOps
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
7+ years of work experience with ETL, data modelling, and data architecture. Proficient in ETL optimization and in designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions. Orchestrate pipelines using Airflow.
Technical Experience:
Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latencies.
➢ Enhancements, new development, defect resolution, and production support of big data ETL development using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue, Lambda functions, and Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, and Athena.
➢ Author ETL processes using Python and PySpark.
➢ Build Redshift Spectrum direct transformations and data models using data in S3.
➢ Monitor ETL processes using CloudWatch events.
➢ You will be working in collaboration with other teams; good communication is a must.
➢ Must have experience using the AWS API, AWS CLI, and SDKs.
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive, real-world experience designing technology components for enterprise solutions and defining solution and reference architectures with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, Dynamo DB, Athena, Glue in AWS environment.
➢ Expertise in S3, RDS, Redshift, Kinesis, and EC2 clusters is highly desired.
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence
Understanding of colors and openness to understanding client requirements
Learning zeal
Positive attitude









