Cognitive Clouds Software Pvt Ltd

Data Engineer

Posted by Talent Acquisition
4 - 6 yrs
Best in industry
Bengaluru (Bangalore)
Skills
Snowflake schema
ETL
Data modeling

We are seeking a skilled Data Engineer with strong proficiency in SQL and extensive experience in data modeling. The ideal candidate will be adept at designing and implementing robust data architectures, including snowflake schemas, ER diagrams, and the full range of warehouse structures: transaction and dimension tables, along with surrogate, foreign, and primary keys.


  • Over 4 years of experience as a data engineer or in a similar role.
  • Technical expertise with data models, data mining, and segmentation techniques.
  • Knowledge of programming languages (e.g., Java and Python).
  • Hands-on experience with SQL database design.
  • Develop and maintain efficient SQL queries for data extraction, transformation, and loading (ETL) processes.
  • Design and implement data models, including snowflake schemas and ER diagrams, to support business requirements (a brief illustrative sketch follows this list).
  • Collaborate with cross-functional teams to understand data needs and requirements, and translate them into scalable database solutions.
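As a rough illustration of these schema concepts, the sketch below uses an in-memory SQLite database and hypothetical table names; a production design would live in a warehouse such as Snowflake or Redshift.

```python
import sqlite3

# Toy snowflake schema: the fact (transaction) table references a product
# dimension, which is normalized further into a category sub-dimension.
# All table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_category (
    category_key  INTEGER PRIMARY KEY,   -- surrogate key
    category_name TEXT NOT NULL
);

CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,   -- surrogate key
    product_name  TEXT NOT NULL,
    category_key  INTEGER NOT NULL REFERENCES dim_category(category_key)  -- foreign key
);

CREATE TABLE fact_sales (
    sales_key   INTEGER PRIMARY KEY,     -- surrogate key
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),     -- foreign key
    quantity    INTEGER NOT NULL,
    amount      REAL NOT NULL
);

INSERT INTO dim_category VALUES (1, 'Beverages');
INSERT INTO dim_product  VALUES (10, 'Green Tea', 1);
INSERT INTO fact_sales   VALUES (100, 10, 3, 12.50);
""")

# A simple ETL-style extraction query walking fact -> dimension -> sub-dimension.
rows = conn.execute("""
    SELECT c.category_name, p.product_name,
           SUM(f.quantity) AS units, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product  p ON p.product_key  = f.product_key
    JOIN dim_category c ON c.category_key = p.category_key
    GROUP BY c.category_name, p.product_name
""").fetchall()
print(rows)  # [('Beverages', 'Green Tea', 3, 12.5)]
```

The fact table holds transactions keyed by surrogate keys, while the product dimension is normalized further into a category sub-dimension, which is the defining trait of a snowflake schema.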

About Cognitive Clouds Software Pvt Ltd

Founded: 2012
Size: 100-1000
At CognitiveClouds we build world-class applications for mobile, web, and cloud. We collaborate with well-known brands and promising startups, architecting, designing, developing, and shipping smart software products.
Connect with the team
Praveen Gopinath
GuruPrasad Jayarao
Jayaprakash Rao
Santosh BR
Pooja Rai
Sowmya Srinivas

Similar jobs

Quinnox
Posted by MidhunKumar T
Bengaluru (Bangalore), Mumbai
10 - 15 yrs
₹30L - ₹35L / yr
ADF
Azure Data Lake Services
SQL Azure
Azure Synapse
Spark
+4 more

Mandatory Skills: Azure Data Lake Storage, Azure SQL databases, Azure Synapse, Databricks (PySpark/Spark), Python, SQL, Azure Data Factory.


Good to have: Power BI, Azure IaaS services, Azure DevOps, Microsoft Fabric.


  • Very strong understanding of ETL and ELT.
  • Very strong understanding of Lakehouse architecture (a brief PySpark sketch follows this list).
  • Very strong knowledge of PySpark and Spark architecture.
  • Good knowledge of Azure Data Lake architecture and access controls.
  • Good knowledge of Microsoft Fabric architecture.
  • Good knowledge of Azure SQL databases.
  • Good knowledge of T-SQL.
  • Good knowledge of CI/CD processes using Azure DevOps.
  • Power BI.
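A minimal PySpark sketch of the lakehouse-style ELT flow implied above, assuming hypothetical lake paths and a plain Parquet write; on Azure these would typically be abfss:// paths on ADLS Gen2 and a Delta table.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical lake paths; on Azure these would usually be abfss:// URIs on
# ADLS Gen2, and the curated table would often be written in Delta format.
RAW_PATH = "/tmp/lake/raw/orders"
CURATED_PATH = "/tmp/lake/curated/daily_revenue"

spark = SparkSession.builder.appName("lakehouse-elt-sketch").getOrCreate()

# EL: read the raw files as landed; the transform happens inside the lake (ELT).
raw = spark.read.json(RAW_PATH)

daily_revenue = (
    raw.withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("revenue"),
            F.count(F.lit(1)).alias("orders"))
)

# Write the curated table partitioned by date; with Delta Lake available this
# would be .format("delta").save(...) instead of .parquet(...).
daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)
```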

Gurugram, Pune, Mumbai, Bengaluru (Bangalore), Chennai, Nashik
4 - 12 yrs
₹12L - ₹15L / yr
Data engineering
Data modeling
data pipeline
Data integration
Data Warehouse (DWH)
+12 more

 

 

Designation – Deputy Manager - TS


Job Description

  1. Total of 8-9 years of development experience in Data Engineering (B1/BII role).
  2. Minimum of 4-5 years in AWS data integrations, with very good data modelling skills.
  3. Should be very proficient in end-to-end AWS data solution design, including strong data ingestion and integration skills (both data at rest and data in motion) as well as complete DevOps knowledge.
  4. Should have experience delivering at least 4 data warehouse or data lake solutions on AWS.
  5. Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc.
  6. Strong Python skills.
  7. Should be an expert in cloud design principles, performance tuning, and cost modelling. AWS certifications will be an added advantage.
  8. Should be a team player with excellent communication skills, able to manage their work independently with minimal or no supervision.
  9. Life Science & Healthcare domain background will be a plus.

Qualifications

BE/BTech/ME/MTech

 

Piako
PiaKo Store
Posted by PiaKo Store
Kolkata
4 - 8 yrs
₹12L - ₹24L / yr
Python
Amazon Web Services (AWS)
ETL

We are a rapidly expanding global technology partner looking for a highly skilled Senior (Python) Data Engineer to join our client's exceptional Technology and Development team. The role is based in Kolkata. If you are passionate about demonstrating your expertise and thrive on collaborating with a group of talented engineers, then this role was made for you!

At the heart of technology innovation, our client specializes in delivering cutting-edge solutions to clients across a wide array of sectors. With a strategic focus on finance, banking, and corporate verticals, they have earned a stellar reputation for their commitment to excellence in every project they undertake.

We are searching for a senior engineer to strengthen their global projects team. They seek an experienced Senior Data Engineer with a strong background in building Extract, Transform, Load (ETL) processes and a deep understanding of AWS serverless cloud environments.

As a vital member of the data engineering team, you will play a critical role in designing, developing, and maintaining data pipelines that facilitate data ingestion, transformation, and storage for our organization.

Your expertise will contribute to the foundation of our data infrastructure, enabling data-driven decision-making and analytics.

Key Responsibilities:

  • ETL Pipeline Development: Design, develop, and maintain ETL processes using Python, AWS Glue, or other serverless technologies to ingest data from various sources (databases, APIs, files), transform it into a usable format, and load it into data warehouses or data lakes (a rough sketch follows this list).
  • AWS Serverless Expertise: Leverage AWS services such as AWS Lambda, AWS Step Functions, AWS Glue, AWS S3, and AWS Redshift to build serverless data pipelines that are scalable, reliable, and cost-effective.
  • Data Modeling: Collaborate with data scientists and analysts to understand data requirements and design appropriate data models, ensuring data is structured optimally for analytical purposes.
  • Data Quality Assurance: Implement data validation and quality checks within ETL pipelines to ensure data accuracy, completeness, and consistency.
  • Performance Optimization: Continuously optimize ETL processes for efficiency, performance, and scalability, monitoring and troubleshooting any bottlenecks or issues that may arise.
  • Documentation: Maintain comprehensive documentation of ETL processes, data lineage, and system architecture to ensure knowledge sharing and compliance with best practices.
  • Security and Compliance: Implement data security measures, encryption, and compliance standards (e.g., GDPR, HIPAA) as required for sensitive data handling.
  • Monitoring and Logging: Set up monitoring, alerting, and logging systems to proactively identify and resolve data pipeline issues.
  • Collaboration: Work closely with cross-functional teams, including data scientists, data analysts, software engineers, and business stakeholders, to understand data requirements and deliver solutions.
  • Continuous Learning: Stay current with industry trends, emerging technologies, and best practices in data engineering and cloud computing and apply them to enhance existing processes.
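As a rough, hedged sketch of the ETL and data-quality responsibilities above (not the client's actual pipeline), the Lambda-style handler below reads a CSV from S3, applies a basic validation, and writes Parquet to a hypothetical curated bucket.

```python
import io

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Hypothetical bucket names, used only for illustration.
RAW_BUCKET = "raw-landing-zone"
CURATED_BUCKET = "curated-zone"


def handler(event, context):
    """Lambda-style entry point: extract a CSV, validate/transform it, load Parquet."""
    key = event["key"]  # assumption: the trigger passes the S3 object key

    # Extract
    body = s3.get_object(Bucket=RAW_BUCKET, Key=key)["Body"].read()
    df = pd.read_csv(io.BytesIO(body))

    # Transform + basic data-quality checks: drop rows missing required fields
    df = df.dropna(subset=["order_id", "amount"])
    df["amount"] = df["amount"].astype(float)

    # Load: write Parquet to the curated bucket (needs pyarrow in the Lambda layer)
    out = io.BytesIO()
    df.to_parquet(out, index=False)
    s3.put_object(Bucket=CURATED_BUCKET,
                  Key=key.replace(".csv", ".parquet"),
                  Body=out.getvalue())
    return {"rows_loaded": len(df)}
```

In practice the same transform could run as an AWS Glue job, with Step Functions orchestrating several such steps.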

Qualifications:

  • Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
  • Proven experience as a Data Engineer with a focus on ETL pipeline development.
  • Strong proficiency in Python programming.
  • In-depth knowledge of AWS serverless technologies and services.
  • Familiarity with data warehousing concepts and tools (e.g., Redshift, Snowflake).
  • Experience with version control systems (e.g., Git).
  • Strong SQL skills for data extraction and transformation.
  • Excellent problem-solving and troubleshooting abilities.
  • Ability to work independently and collaboratively in a team environment.
  • Effective communication skills for articulating technical concepts to non-technical stakeholders.
  • Certifications such as AWS Certified Data Analytics - Specialty or AWS Certified DevOps Engineer are a plus.

Preferred Experience:

  • Knowledge of data orchestration and workflow management tools
  • Familiarity with data visualization tools (e.g., Tableau, Power BI).
  • Previous experience in industries with strict data compliance requirements (e.g., insurance, finance) is beneficial.

What You Can Expect:

- Innovation Abounds: Join a company that constantly pushes the boundaries of technology and encourages creative thinking. Your ideas and expertise will be valued and put to work in pioneering solutions.

- Collaborative Excellence: Be part of a team of engineers who are as passionate and skilled as you are. Together, you'll tackle challenging projects, learn from each other, and achieve remarkable results.

- Global Impact: Contribute to projects with a global reach and make a tangible difference. Your work will shape the future of technology in finance, banking, and corporate sectors.

They offer an exciting and professional environment with great career and growth opportunities. Their office is located in the heart of Salt Lake Sector V, offering a terrific workspace that's both accessible and inspiring. Their team members enjoy a professional work environment with regular team outings. Joining the team means becoming part of a vibrant and dynamic team where your skills will be valued, your creativity will be nurtured, and your contributions will make a difference. In this role, you can work alongside some of the brightest minds in the industry.

If you're ready to take your career to the next level and be part of a dynamic team that's driving innovation on a global scale, we want to hear from you.

Apply today for more information about this exciting opportunity.

Onsite Location: Kolkata, India (Salt Lake Sector V)


Kaleidofin
Posted by Poornima B
Chennai, Bengaluru (Bangalore)
2 - 4 yrs
Best in industry
PowerBI
Business Intelligence (BI)
Python
Tableau
SQL
+1 more
We are looking for a developer to design and deliver strategic data-centric insights leveraging next-generation analytics and BI technologies. We want someone who is data-centric and insight-centric rather than report-centric. We are looking for someone who wants to make an impact by enabling innovation and growth; someone with passion for what they do and a vision for the future.

Responsibilities:
  • Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
  • Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
  • Become an expert on data and trends, both internal and external to Kaleidofin.
  • Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
  • Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
  • Automate scheduling and distribution of reports and support auditing and value realization.
  • Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
  • Design robust data-centric solutions and architecture that incorporates technology and strong BI solutions to scale up and eliminate repetitive tasks.
 Requirements:
  • Experience leading development efforts through all phases of SDLC.
  • 2+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
  • Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
  • Hands on experience in SQL, data management, and scripting (preferably Python).
  • Strong data visualisation design skills, data modeling and inference skills.
  • Hands-on and experience in managing small teams.
  • Financial services experience preferred, but not mandatory.
  • Strong knowledge of architectural principles, tools, frameworks, and best practices.
  • Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
  • Candidates with a notice period of less than 30 days are preferred.
Mactores Cognition Private Limited
Remote only
5 - 15 yrs
₹5L - ₹21L / yr
ETL
Informatica
Data Warehouse (DWH)
Amazon Web Services (AWS)
Amazon S3
+3 more

Mactores is a trusted leader among businesses in providing modern data platform solutions. Since 2008, Mactores have been enabling businesses to accelerate their value through automation by providing End-to-End Data Solutions that are automated, agile, and secure. We collaborate with customers to strategize, navigate, and accelerate an ideal path forward with a digital transformation via assessments, migration, or modernization.


We are looking for a DataOps Engineer with expertise in operating a data lake. The data lake is built on Amazon S3 and Amazon EMR, with Apache Airflow for workflow management.


You have experience building and running data lake platforms on AWS, exposure to operating PySpark-based ETL jobs in Apache Airflow and Amazon EMR, and expertise in monitoring services like Amazon CloudWatch.


If you love solving problems, you will enjoy our professional services background and an unusual, fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.


What you will do?


  • Operate the current data lake deployed on AWS with Amazon S3, Amazon EMR, and Apache Airflow (a minimal Airflow sketch follows this list).
  • Debug and fix production issues in PySpark.
  • Determine the RCA (root cause analysis) for production issues.
  • Collaborate with product teams on L3/L4 production issues in PySpark.
  • Contribute to enhancing ETL efficiency.
  • Build CloudWatch dashboards to optimize operational efficiency.
  • Handle escalation tickets from L1 monitoring engineers.
  • Assign tickets to L1 engineers based on their expertise.
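A minimal Apache Airflow sketch of the kind of workflow such a data lake runs; the DAG name, schedule, and script path are hypothetical, and the EMR submission is simplified to a plain spark-submit.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical job location; in practice this PySpark script would be submitted
# as an EMR step rather than run with a local spark-submit.
ETL_SCRIPT = "s3://example-data-lake/jobs/clean_events.py"

with DAG(
    dag_id="data_lake_daily_etl",        # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["dataops", "sketch"],
) as dag:
    run_pyspark_etl = BashOperator(
        task_id="run_pyspark_etl",
        bash_command=f"spark-submit {ETL_SCRIPT}",
    )
```

In a real deployment the step would typically go through the Amazon provider's EMR operators, with CloudWatch alarms on the job metrics.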


What are we looking for?


  • AWS DataOps engineer.
  • Overall 5+ years of experience in the software industry, including experience developing and architecting data applications using Python or Scala, Airflow, and Kafka on the AWS data platform.
  • Must have set up or led a project to enable DataOps on AWS or another cloud data platform.
  • Strong data engineering experience on a cloud platform, preferably AWS.
  • Experience with data pipelines designed for reuse and parameterization.
  • Experience with pipelines designed to solve common ETL problems.
  • Understanding of, or experience with, codifying AWS services such as Amazon EMR and Apache Airflow to enable DataOps.
  • Experience in building data pipelines using CI/CD infrastructure.
  • Understanding of infrastructure as code for DataOps enablement.
  • Ability to work with ambiguity and create quick PoCs.


You will be preferred if you have:


  • Expertise in Amazon EMR, Apache Airflow, Terraform, CloudWatch
  • Exposure to MLOps using Amazon Sagemaker is a plus.
  • AWS Solutions Architect Professional or Associate Level Certificate
  • AWS DevOps Professional Certificate


Life at Mactores


We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.


1. Be one step ahead

2. Deliver the best

3. Be bold

4. Pay attention to the detail

5. Enjoy the challenge

6. Be curious and take action

7. Take leadership

8. Own it

9. Deliver value

10. Be collaborative


We would like you to read more details about the work culture on https://mactores.com/careers 


The Path to Joining the Mactores Team

At Mactores, our recruitment process is structured around three distinct stages:


Pre-Employment Assessment: You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.


Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.


HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.


At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.


APL Audit Operations India
Bengaluru (Bangalore)
1 - 4 yrs
₹7L - ₹12L / yr
SQL Server Integration Services (SSIS)
SQL
ETL
Informatica
Data Warehouse (DWH)
+4 more

About Company:

Working with a multitude of clients populating the FTSE and Fortune 500s, Audit Partnership is a people-focused organization with a strong belief in our employees. We hire the best people to provide the best services to our clients.

APL offers profit recovery services to organizations of all sizes across a number of sectors. APL was born out of a desire to offer an alternative to the stagnant service provision on offer in the profit recovery industry.

Every year we recover millions of pounds for our clients and also work closely with them, sharing our audit findings to minimize future losses. Our dedicated and highly experienced audit teams utilize progressive and dynamic financial service solutions and industry-leading technology to achieve maximum success.

We provide dynamic work environments focused on delivering data-driven solutions at a rapidly increased pace over traditional development. Be a part of our passionate and motivated team who are excited to use the latest in software technologies within financial services.

Headquartered in the UK, we have expanded from a small team in 2002 to a market leading organization serving clients across the globe while keeping our clients at the heart of all decisions we make.


Job description:

We are looking for a high-potential, enthusiastic SQL Data Engineer with a strong desire to build a career in data analysis, database design, and application solutions. Reporting directly to our UK-based Technology team, you will provide support to our global operation in the delivery of data analysis, conversion, and application development to our core audit functions.

Duties will include assisting with data loads, using T-SQL to analyse data, front-end code changes, data housekeeping, data administration, and supporting the Data Services team as a whole. Your contribution will grow in line with your experience and skills, and you will become increasingly involved in the core service functions and client delivery. We are looking for a self-starter with a deep commitment to the highest standards of quality and customer service. This is a fantastic career opportunity, working for a leading international financial services organisation serving the world's largest organisations.
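For a flavour of the day-to-day T-SQL analysis described above, here is a small illustrative sketch (not APL's actual tooling) that runs an anomaly check from Python via pyodbc against a hypothetical SQL Server database.

```python
import pandas as pd
import pyodbc

# Hypothetical connection details; real values would come from configuration.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=AuditDB;Trusted_Connection=yes;"
)

# A simple T-SQL anomaly check: invoices whose line totals disagree with the header.
query = """
SELECT  h.InvoiceId,
        h.HeaderTotal,
        SUM(l.LineAmount) AS LineTotal
FROM    dbo.InvoiceHeader h
JOIN    dbo.InvoiceLine   l ON l.InvoiceId = h.InvoiceId
GROUP BY h.InvoiceId, h.HeaderTotal
HAVING  SUM(l.LineAmount) <> h.HeaderTotal;
"""

mismatches = pd.read_sql(query, conn)
print(mismatches.head())
```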

 

What we are looking for:

  • 1-2 years of previous experience in a similar role
  • Data analysis and conversion skills using Microsoft SQL Server are essential
  • An understanding of relational database design and build
  • Schema design, normalising data, indexing, query performance analysis
  • Ability to analyse complex data to identify patterns and detect anomalies
  • Assisting with ETL design and implementation projects
  • Knowledge or experience in one or more of the key technologies below would be preferable:
    • Microsoft SQL Server (SQL Server Management Studio, stored procedure writing, etc.)
    • T-SQL
    • Programming languages (C#, VB, Python, etc.)
    • Use of Python to manipulate and import data
    • Experience with ETL/automation is advantageous but not essential (SSIS/Prefect/Azure)
  • A self-starter who can drive projects with minimal guidance
  • Meeting stakeholders to agree system requirements
  • Someone who is enthusiastic and eager to learn
  • Very good command of English and excellent communication skills

 

Perks & Benefits:

  • A fantastic work life balance
  • Competitive compensation and benefits
  • Exposure of working with Fortune 500 organization
  • Expert guidance and nurture from global leaders
  • Opportunities for career and personal advancement with our continued global growth strategy
  • Industry leading training programs
  • A working environment that is exciting, fun and engaging

 

Pune, Mumbai, Bengaluru (Bangalore), Chennai, Noida, Hyderabad
7 - 12 yrs
₹5L - ₹15L / yr
Snowflake schema
SnowSQL
Snowpipe
There is an urgent requirement for a Snowflake Developer at an MNC. Candidates should be able to join in April or by the second week of May. Required skills include Snowpipe, SnowSQL, Snowflake schema design, and Snowflake development. The budget for this profile is up to ₹22 LPA.
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own Data integrity across distributed systems.
  • Extract, Transform and Load data from multiple systems for reporting into BI platform.
  • Create Data Sets and Data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the project manager to deliver on business requirements OTIF (on time in full).
  • Understand the cross-functional business data points thoroughly and be SPOC for all data-related queries.
  • Work with both Web Analytics and Backend Data analytics.
  • Support the rest of the BI team in generating reports and analysis
  • Quickly learn and use bespoke and third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for Leadership.

 Requirements:

  • Strong experience in Datawarehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports, and dashboards based on business requirements
  • Knowledge and experience in prototyping, designing, and requirement analysis
  • Be able to implement row-level security on data and understand application security layer models in Power BI
  • Proficiency in making DAX queries in Power BI desktop.
  • Expertise in using advanced level calculations on data sets
  • Experience in the Fintech domain and stakeholder management.
Avegen India Pvt. Ltd
Posted by Shubham Shinde
Pune
3 - 8 yrs
₹3L - ₹20L / yr
Artificial Intelligence (AI)
Deep Learning
Machine Learning (ML)
Data extraction
+3 more
Responsibilities
● Frame ML / AI use cases that can improve the company's product
● Implement and develop ML / AI / data-driven rule-based algorithms as software items
● For example, building a chatbot that replies with an answer from the relevant FAQ, and reinforcing the system with a feedback loop so that the bot improves (a brief sketch follows this list)
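A minimal sketch of the FAQ-chatbot example above, assuming a scikit-learn TF-IDF retriever over a tiny hypothetical FAQ list (the feedback loop is omitted).

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical FAQ entries; a real system would load these from a database.
faqs = [
    ("How do I reset my password?", "Use the 'Forgot password' link on the login page."),
    ("How can I contact support?", "Email support@example.com or use the in-app chat."),
]

questions = [q for q, _ in faqs]
vectorizer = TfidfVectorizer()
faq_matrix = vectorizer.fit_transform(questions)


def answer(user_question: str) -> str:
    """Return the answer attached to the most similar FAQ question."""
    scores = cosine_similarity(vectorizer.transform([user_question]), faq_matrix)[0]
    return faqs[scores.argmax()][1]


print(answer("I forgot my password"))  # -> password-reset answer
```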
Must have skills:
● Data extraction and ETL
● Python (numpy, pandas, comfortable with OOP)
● Django
● Knowledge of basic Machine Learning / Deep Learning / AI algorithms and ability to
implement them
● Good understanding of SDLC
● Deployed ML / AI model in a mobile / web product
● Soft skills: strong communication skills and critical-thinking ability

Good to have:
● Full stack development experience
Required Qualification:
B.Tech. / B.E. degree in Computer Science or an equivalent software engineering discipline
Bengaluru (Bangalore)
3 - 7 yrs
₹7L - ₹15L / yr
Business Intelligence (BI)
Data modeling
PowerBI
Microsoft SSRS
Microsoft Business Intelligence (MSBI)
+1 more
Hi All,
Greetings from CareerNet Technologies!
It's a pleasure talking to you.
Please find below the details:
Role: Power BI Developer
Company: KOCH (https://www.kochind.com)
Type: Permanent (Direct payroll)
Edu: Any full-time graduate
Exp: 4+ yrs
Job Location: Kundalahalli, near Brookefield Hospital, Bangalore - 560037
As discussed, please find attached the JDs, company details, and its principles.

Job Description: 

  • 3+ years’ experience developing and implementing enterprise-scale reports and dashboards.

  • Proficiency with MS Power BI / SSRS.

  • Knowledge of logical and physical data modeling concepts (relational and dimensional).

  • Understanding of structured query language (SQL).
