Senior Data Consultant (Talend DI)
Posted by Ashwini Dhaipule
3 - 5 yrs
₹6L - ₹10L / yr
Pune
Skills
PowerBI
Data Visualization
Data architecture
Informatica PowerCenter
SQL
Business Intelligence (BI)
Cloud Computing
Data Analytics
Talend DI

Pingahla is recruiting Business Intelligence Consultants and Senior Consultants who can help us with Information Management projects (domestic, onshore, and offshore) as developers and team leads. Candidates are expected to have 3-6 years of experience with Informatica PowerCenter, Talend DI, or Informatica Cloud and must be highly proficient with Business Intelligence in general. The job is based out of our Pune office.

Responsibilities:

  • Manage the customer relationship by serving as the single point of contact before, during and after engagements.
  • Architect data management solutions.
  • Provide technical leadership to other consultants and/or customer/partner resources.
  • Design, develop, test and deploy data integration solutions in accordance with customer’s schedule.
  • Supervise and mentor all intermediate and junior level team members.
  • Provide regular reports to communicate status both internally and externally.
Qualifications:

A typical profile that would suit this position has the following background:

  • A graduate from a reputed engineering college
  • Excellent IQ and analytical skills, with the ability to grasp new concepts and learn new technologies
  • A willingness to work with a small team in a fast-growing environment
  • Good knowledge of Business Intelligence concepts

 

Mandatory Requirements:

  • Knowledge of Business Intelligence
  • Good knowledge of at least one of the following data integration tools: Informatica PowerCenter, Talend DI, Informatica Cloud
  • Knowledge of SQL
  • Excellent English and communication skills
  • Intelligent, quick to learn new technologies
  • A track record of effectively handling customers and managing complex data management needs
     

 


About Pinghala

Founded: 2018
Size: 20-100
Stage: Profitable
About
Pingahla was founded by a group of people passionate about making the world a better place by harnessing the power of Data. We are a data management firm with offices in New York and India. Our mission is to help transform the way companies operate and think about their business. We make it easier to adopt and stay ahead of the curve in the ever-changing digital landscape. One of our core beliefs is excellence in everything we do!

Similar jobs

UpSolve Solutions LLP
Posted by Shaurya Kuchhal
Mumbai, Navi Mumbai
2 - 6 yrs
₹4L - ₹8L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL
MS-PowerPoint

Company Description

UpSolve is a Gen AI and Vision AI startup that helps businesses solve their problems by building custom solutions that drive strategic business decisions. Whether your business is facing time constraints or a lack of resources, UpSolve can help. We build enterprise-grade AI solutions with a focus on increasing ROI.


Role Description

This is a full-time hybrid role for a Business Analyst located in Mumbai.


Please note: This is an onsite role and good communication skills are expected (oral + written)


Responsibilities

1. Understand existing system integrations for the client.

2. Map and identify gaps in existing systems.

3. Ideate, advise on, and implement AI solutions to optimize business processes.

4. Collaborate with multiple teams and stakeholders.


Qualifications

  • MBA with a focus on Business Analytics, or a Bachelor's degree in Computer Science or IT
  • Minimum 4 Years of Experience
  • Strong written, verbal and collaboration skills
  • Immediate Joiner (Less than 5 days)


Work Location: Mumbai, Work from Office

Series 'A' funded Silicon Valley based BI startup
Agency job
via Qrata by Prajakta Kulkarni
Bengaluru (Bangalore)
4 - 6 yrs
₹30L - ₹45L / yr
Data engineering
Data Engineer
Scala
Data Warehouse (DWH)
Big Data
It is the leader in capturing technographics-powered buying intent and helps companies uncover the 3% of active buyers in their target market. It evaluates over 100 billion data points and analyzes factors such as buyer journeys, technology adoption patterns, and other digital footprints to deliver market and sales intelligence. Its customers have access to the buying patterns and contact information of more than 17 million companies and 70 million decision makers across the world.

Role – Data Engineer

Responsibilities

  • Work in collaboration with the application team and integration team to design, create, and maintain optimal data pipeline architecture and data structures for the Data Lake/Data Warehouse.
  • Work with stakeholders including the Sales, Product, and Customer Support teams to assist with data-related technical issues and support their data analytics needs.
  • Assemble large, complex data sets from third-party vendors to meet business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Elasticsearch, MongoDB, and AWS technology.
  • Streamline existing reporting and introduce enhanced reporting and analysis solutions that leverage complex data sources derived from multiple internal systems.

Requirements
  • 5+ years of experience in a Data Engineer role.
  • Proficiency in Linux.
  • Strong SQL knowledge and experience working with relational databases and query authoring, as well as familiarity with databases including MySQL, Mongo, Cassandra, and Athena.
  • Must have experience with Python/Scala.
  • Must have experience with Big Data technologies like Apache Spark.
  • Must have experience with Apache Airflow.
  • Experience with data pipeline and ETL tools like AWS Glue.
  • Experience working with AWS cloud services: EC2, S3, RDS, and Redshift.
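To make the stack above concrete, here is a minimal sketch of the kind of PySpark job such a pipeline might include: ingest third-party data from S3, standardize it, and write curated Parquet. It is illustrative only; the bucket names, paths, and columns are hypothetical placeholders, not this company's systems.

```python
# Minimal PySpark sketch: read raw vendor data from S3, normalize it, and
# write it in a warehouse-friendly format. All names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("vendor_ingest").getOrCreate()

# Read third-party vendor data (JSON lines) from S3.
raw = spark.read.json("s3a://example-vendor-bucket/accounts/2024/*.json")

# Light transformation: standardize column names and types, drop duplicates.
clean = (
    raw.select(
        F.col("account_id").cast("string").alias("account_id"),
        F.to_date("signup_date").alias("signup_date"),
        F.col("employee_count").cast("int").alias("employee_count"),
    )
    .dropDuplicates(["account_id"])
)

# Write partitioned Parquet for downstream warehouse loads.
clean.write.mode("overwrite").partitionBy("signup_date").parquet(
    "s3a://example-lake-bucket/curated/accounts/"
)
```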
Read more
Product and Service based company
Agency job
via Jobdost by Sathish Kumar
Hyderabad, Ahmedabad
8 - 12 yrs
₹15L - ₹30L / yr
SQL Server
Relational Database (RDBMS)
NoSQL Databases
Oracle
Database Design

Job Description

Job Responsibilities

  • Design and implement robust database solutions including

    • Security, backup and recovery

    • Performance, scalability, monitoring and tuning,

    • Data management and capacity planning,

    • Planning, and implementing failover between database instances.

  • Create data architecture strategies for each subject area of the enterprise data model.

  • Communicate plans, status and issues to higher management levels.

  • Collaborate with the business, architects, and other IT organizations to plan a data strategy, sharing important information related to database concerns and constraints

  • Produce all project data architecture deliverables.

  • Create and maintain a corporate repository of all data architecture artifacts.

 

Skills Required:

  • Understanding of data analysis, business principles, and operations

  • Software architecture and design; network design and implementation

  • Data visualization, data migration and data modelling

  • Relational database management systems

  • DBMS software, including SQL Server  

  • Database and cloud computing design, architectures and data lakes

  • Information management and data processing on multiple platforms 

  • Agile methodologies and enterprise resource planning implementation

  • Ability to demonstrate database technical functionality, such as performance tuning, backup and recovery, and monitoring.

  • Excellent skills with advanced features such as database encryption, replication, partitioning, etc.

  • Strong problem-solving, organizational, and communication skills.
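As an illustration of the performance-tuning and monitoring work listed above, below is a minimal sketch that checks SQL Server index fragmentation from Python. The pyodbc connection details are hypothetical placeholders; only the standard sys.dm_db_index_physical_stats DMV is assumed.

```python
# Minimal sketch: use pyodbc to flag fragmented SQL Server indexes, the kind
# of routine tuning check implied above. Connection values are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=example-server;DATABASE=ExampleDB;UID=example_user;PWD=example_pwd"
)

query = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;
"""

for table_name, index_name, frag in conn.cursor().execute(query):
    # Indexes above ~30% fragmentation are the usual candidates for a REBUILD.
    print(f"{table_name}.{index_name}: {frag:.1f}% fragmented")
```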

RandomTrees
Posted by Amareswarreddt yaddula
Remote only
5 - 10 yrs
₹1L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark

Job Title: Senior Data Engineer

Experience: 8 to 11 years

Location: Remote

Notice: Immediate or Max 1Month

Role: Permanent Role


Skill set: Google Cloud Platform, BigQuery, Java, Python, Airflow, Dataflow, Apache Beam.


Experience required:

5 years of experience in software design and development, with 4 years of experience in the data engineering field, is preferred.

2 years of hands-on experience with GCP cloud data implementation suites such as BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage, etc.

Strong experience and understanding of very large-scale data architecture, solutions, and operationalization of data warehouses, data lakes, and analytics platforms.

Mandatory 1 year of software development skills using Java or Python.

Extensive hands-on experience working with data using SQL and Python.


Must have: GCP, BigQuery, Airflow, Dataflow, Python, Java.


  • GCP knowledge is a must
  • Java as a programming language (preferred)
  • BigQuery, Pub/Sub, Dataflow/Apache Beam, Airflow/Composer, Cloud Storage
  • Python
  • Good communication skills
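To illustrate the GCP stack named above, here is a minimal sketch of an Airflow DAG that runs a daily BigQuery load using the Google provider's BigQueryInsertJobOperator. The project, dataset, and table names are hypothetical, and exact operator and DAG arguments can vary with Airflow and provider versions.

```python
# Minimal sketch of a daily BigQuery load orchestrated by Airflow (2.4+ style).
# Project, dataset, and table names are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_to_bq",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    load_orders = BigQueryInsertJobOperator(
        task_id="load_orders",
        configuration={
            "query": {
                "query": """
                    INSERT INTO `example-project.analytics.orders_daily`
                    SELECT order_id, customer_id, order_total, DATE(created_at) AS order_date
                    FROM `example-project.staging.orders`
                    WHERE DATE(created_at) = '{{ ds }}'
                """,
                "useLegacySql": False,
            }
        },
    )
```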


Aequor Technologies
Ranjana Guru
Posted by Ranjana Guru
Remote only
8 - 15 yrs
₹1L - ₹20L / yr
Data Analytics
Data Warehousing
Data architecture
SAP HANA

Required Skills:

  • Proven work experience as an Enterprise / Data / Analytics Architect - Data Platform in HANA XSA, XS, Data Intelligence and SDI
  • Can work on new and existing architecture decisions in HANA XSA, XS, Data Intelligence, and SDI
  • Well-versed in data architecture principles, software/web application design, API design, UI/UX capabilities, and XSA/Cloud Foundry architecture
  • In-depth understanding of database structure (HANA in-memory) principles
  • In-depth understanding of ETL solutions and data integration strategy
  • Excellent knowledge of Software and Application design, API, XSA, and microservices concepts
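For a concrete flavour of the HANA-facing work above, below is a minimal sketch that connects to SAP HANA from Python using the hdbcli driver and runs a simple aggregate query. The host, credentials, and schema/table names are hypothetical placeholders, not part of this engagement's landscape.

```python
# Minimal sketch: query an SAP HANA in-memory table from Python via hdbcli.
# Connection details and table names are hypothetical placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="example-hana-host",
    port=30015,
    user="EXAMPLE_USER",
    password="example_password",
)

cursor = conn.cursor()
cursor.execute(
    'SELECT "CUSTOMER_ID", SUM("NET_AMOUNT") FROM "SALES"."ORDERS" '
    'GROUP BY "CUSTOMER_ID" ORDER BY 2 DESC LIMIT 10'
)
for customer_id, total in cursor.fetchall():
    print(customer_id, total)

cursor.close()
conn.close()
```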

 

Roles & Responsibilities:

  • Advise on and ensure compliance with the defined Data Architecture principles.
  • Identify new technology updates and development tools, including new releases/upgrades/patches, as required.
  • Analyze technical risks and advise on risk mitigation strategy.
  • Advise on and ensure compliance with existing and newly developed data and reporting standards, including naming conventions.

 

The time window is ideally AEST (8 am till 5 pm), which means starting at 3:30 am IST. We understand this can be very early for an SME supporting from India; hence, we can consider candidates who can support from at least 7 am IST (earlier is possible).

DUNNHUMBY IT SERVICES INDIA
Yamini Rawat
Posted by Yamini Rawat
Gurugram
2 - 5 yrs
₹2L - ₹9L / yr
Python
SQL
Machine Learning (ML)
Forecasting

Most companies try to meet expectations, dunnhumby exists to defy them. Using big data, deep expertise and AI-driven platforms to decode the 21st century human experience – then redefine it in meaningful and surprising ways that put customers first. Across digital, mobile and retail. For brands like Tesco, Coca-Cola, Procter & Gamble and PepsiCo.

We’re looking for an Applied Data Scientist who expects more from their career. It’s a chance to apply your expertise to distil complex problems into compelling insights using the best of machine learning and human creativity to deliver effective and impactful solutions for clients. Joining our advanced data science team, you’ll investigate, develop, implement and deploy a range of complex applications and components while working alongside super-smart colleagues challenging and rewriting the rules, not just following them.

What we expect from you 

  • Degree in Statistics, Maths, Physics, Economics or similar field
  • Programming skills (Python and SQL are a must have)
  • Analytical Techniques and Technology
  • Experience with and passion for connecting your work directly to the customer experience, making a real and tangible impact.
  • Logical thinking and problem solving
  • Strong communication skills
  • Statistical Modelling and experience of applying data science into client problems
  • 2 to 5 years of experience required
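As a flavour of the Python-based statistical modelling and forecasting this role involves, here is a minimal, self-contained sketch that fits a simple lag-based regression on synthetic weekly data. It is illustrative only and does not reflect dunnhumby's actual methods or tooling.

```python
# Minimal forecasting sketch: lag-based linear regression on a toy weekly
# sales series. The data is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
weeks = np.arange(104)
sales = 100 + 0.5 * weeks + 10 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 3, 104)

# Use the previous 4 weeks as features to predict the next week.
n_lags = 4
X = np.column_stack([sales[i : len(sales) - n_lags + i] for i in range(n_lags)])
y = sales[n_lags:]

model = LinearRegression().fit(X[:-12], y[:-12])       # train on all but the last 12 weeks
print("holdout R^2:", model.score(X[-12:], y[-12:]))   # evaluate on the held-out weeks
print("next-week forecast:", model.predict(sales[-n_lags:].reshape(1, -1))[0])
```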


What you can expect from us

We won’t just meet your expectations. We’ll defy them. So you’ll enjoy the comprehensive rewards package you’d expect from a leading technology company. But also, a degree of personal flexibility you might not.

Plus, thoughtful perks, like early finish Friday and your birthday off.

You’ll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn.

And we don’t just talk about diversity and inclusion. We live it every day – with thriving networks including dh Women’s Network, dh Proud, dh Parent’s & Carer’s, dh One and dh Thrive as the living proof. Everyone’s invited.

Our approach to Flexible Working

At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work.

We believe that you will do your best at work if you have a work / life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process.


 

Leena AI
Preethi Gothandam
Posted by Preethi Gothandam
Remote only
2 - 8 yrs
₹25L - ₹40L / yr
Natural Language Processing (NLP)
Machine Learning (ML)
Data Science
Data Analytics

Responsibilities: 

  • Improve the robustness of Leena AI's current NLP stack
  • Increase the zero-shot learning capability of Leena AI's current NLP stack
  • Opportunity to add/build new NLP architectures based on requirements
  • Manage the end-to-end lifecycle of the data in the system until it achieves more than 90% accuracy
  • Manage an NLP team

Requirements: 

  • Strong understanding of linear algebra, optimisation, probability, statistics 
  • Experience in the data science methodology from exploratory data analysis, feature engineering, model selection, deployment of the model at scale and model evaluation 
  • Experience in deploying NLP architectures in production 
  • Understanding of the latest NLP architectures like transformers is good to have
  • Experience in adversarial attacks/robustness of DNNs is good to have
  • Experience with a Python web framework (Django) and analytics and machine learning frameworks like TensorFlow/Keras/PyTorch.
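To make the zero-shot requirement above concrete, here is a minimal sketch of zero-shot intent classification with the Hugging Face transformers pipeline. The model choice and candidate labels are illustrative assumptions, not Leena AI's actual stack.

```python
# Minimal sketch: zero-shot classification of an employee query into
# illustrative intent labels using a pretrained NLI model.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

ticket = "I still haven't received my laptop even though I joined two weeks ago."
labels = ["IT hardware request", "payroll issue", "leave request", "onboarding query"]

result = classifier(ticket, candidate_labels=labels)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
```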
15-year-old US-based Product Company
Agency job
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
Informatica Developer
Informatica MDM
Data integration
Informatica Data Quality
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
  • Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
  • Experience with the SIF framework, including real-time integration.
  • Should have experience in building C360 Insights using Informatica.
  • Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
  • Should have experience in building different data warehouse architectures such as Enterprise, Federated, and Multi-Tier architecture.
  • Should have experience in configuring Informatica Data Director with reference to the Data Governance of users, IT Managers, and Data Stewards.
  • Should have good knowledge of developing complex PL/SQL queries.
  • Should have working experience on UNIX and shell scripting to run the Informatica workflows and to control the ETL flow.
  • Should know about Informatica Server installation and have knowledge of the Administration console.
  • Working experience with Developer along with Administration is an added advantage.
  • Working experience in Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.
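As an illustration of the UNIX/shell-scripting requirement above, here is a minimal sketch of triggering a PowerCenter workflow from Python via the pmcmd command-line utility. The service, domain, folder, and workflow names are hypothetical placeholders, and the exact pmcmd options depend on the environment.

```python
# Minimal sketch: kick off an Informatica PowerCenter workflow with pmcmd.
# All names and credentials below are hypothetical placeholders.
import subprocess

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "IS_EXAMPLE",          # Integration Service name
    "-d", "Domain_Example",       # Informatica domain
    "-u", "etl_user",
    "-p", "etl_password",
    "-f", "EXAMPLE_FOLDER",       # repository folder
    "-wait",                      # block until the workflow finishes
    "wf_load_customer_360",       # workflow name
]

result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    raise RuntimeError(f"Workflow failed (exit code {result.returncode}): {result.stderr}")
```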
Our client company is into Analytics. (RF1)
Agency job
via Multi Recruit by Ragul Ragul
Bengaluru (Bangalore)
3 - 5 yrs
₹12L - ₹14L / yr
Data Engineer
Big Data
Python
Amazon Web Services (AWS)
SQL
  • We are looking for a Data Engineer with 3-5 years of experience in Python, SQL, AWS (EC2, S3, Elastic Beanstalk, API Gateway), and Java.
  • The applicant must be able to perform Data Mapping (data type conversion, schema harmonization) using Python, SQL, and Java.
  • The applicant must be familiar with and have programmed ETL interfaces (OAUTH, REST API, ODBC) using the same languages.
  • The company is looking for someone who shows an eagerness to learn and who asks concise questions when communicating with teammates.
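To illustrate the data mapping described above (data type conversion and schema harmonization), here is a minimal pandas sketch that maps two hypothetical vendor extracts onto one target schema. The file names and column mappings are placeholders, not the client's actual data.

```python
# Minimal sketch: harmonize two vendor extracts with different column names
# and types into a single target schema. All names are hypothetical.
import pandas as pd

TARGET_COLUMNS = ["customer_id", "signup_date", "revenue"]

def harmonize(df: pd.DataFrame, column_map: dict) -> pd.DataFrame:
    """Rename source columns to the target schema and coerce types."""
    out = df.rename(columns=column_map)[TARGET_COLUMNS]
    out["customer_id"] = out["customer_id"].astype("string")
    out["signup_date"] = pd.to_datetime(out["signup_date"], errors="coerce")
    out["revenue"] = pd.to_numeric(out["revenue"], errors="coerce")
    return out

vendor_a = pd.read_csv("vendor_a.csv")   # columns: cust_id, created_on, rev_usd
vendor_b = pd.read_csv("vendor_b.csv")   # columns: CustomerID, SignupDate, Revenue

combined = pd.concat(
    [
        harmonize(vendor_a, {"cust_id": "customer_id", "created_on": "signup_date", "rev_usd": "revenue"}),
        harmonize(vendor_b, {"CustomerID": "customer_id", "SignupDate": "signup_date", "Revenue": "revenue"}),
    ],
    ignore_index=True,
)
print(combined.dtypes)
```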
NeenOpal Intelligent Solutions Private Limited
Pavel Gupta
Posted by Pavel Gupta
Remote, Bengaluru (Bangalore)
2 - 5 yrs
₹6L - ₹12L / yr
ETL
Python
Amazon Web Services (AWS)
SQL
skill iconPostgreSQL

We are actively seeking a Senior Data Engineer experienced in building data pipelines and integrations from 3rd party data sources by writing custom automated ETL jobs using Python. The role will work in partnership with other members of the Business Analytics team to support the development and implementation of new and existing data warehouse solutions for our clients. This includes designing database import/export processes used to generate client data warehouse deliverables.
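As a rough illustration of the kind of custom automated ETL job described above, here is a minimal Python sketch that extracts records from a hypothetical REST endpoint, transforms them with pandas, and loads them into a PostgreSQL staging table via SQLAlchemy. The endpoint, credentials, and table names are placeholders, not a client system.

```python
# Minimal ETL sketch: REST API -> pandas -> PostgreSQL staging table.
# Endpoint, credentials, and table names are hypothetical placeholders.
import pandas as pd
import requests
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/orders"
engine = create_engine("postgresql+psycopg2://etl_user:etl_password@localhost:5432/warehouse")

def extract() -> list[dict]:
    response = requests.get(API_URL, params={"updated_since": "2024-01-01"}, timeout=30)
    response.raise_for_status()
    return response.json()["results"]

def transform(records: list[dict]) -> pd.DataFrame:
    df = pd.DataFrame(records)
    df["order_date"] = pd.to_datetime(df["order_date"])
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
    return df.drop_duplicates(subset="order_id")

def load(df: pd.DataFrame) -> None:
    # Append into a staging table; downstream models merge into final tables.
    df.to_sql("stg_orders", engine, schema="staging", if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract()))
```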

 

Requirements
  • 2+ years of experience as an ETL developer, with strong data architecture knowledge around data warehousing concepts, SQL development and optimization, and operational support models.
  • Experience using Python to automate ETL/Data Processes jobs.
  • Design and develop ETL and data processing solutions using data integration tools, python scripts, and AWS / Azure / On-Premise Environment.
  • Experience / Willingness to learn AWS Glue / AWS Data Pipeline / Azure Data Factory for Data Integration.
  • Develop and create transformation queries, views, and stored procedures for ETL processes, and process automation.
  • Document data mappings, data dictionaries, processes, programs, and solutions as per established standards for data governance.
  • Work with the data analytics team to assess and troubleshoot potential data quality issues at key intake points, such as validating control totals at intake and again after transformation, and transparently build lessons learned into future data quality assessments.
  • Solid experience with data modeling, business logic, and RESTful APIs.
  • Solid experience in the Linux environment.
  • Experience with NoSQL / PostgreSQL preferred
  • Experience working with databases such as MySQL, NoSQL, and Postgres, and enterprise-level connectivity experience (such as connecting over TLS and through proxies).
  • Experience with NGINX and SSL.
  • Performance tune data processes and SQL queries, and recommend and implement data process optimization and query tuning techniques.