Informatica Cloud
4 - 7 yrs
₹12L - ₹15L / yr
Bengaluru (Bangalore), Hyderabad, Chennai
Skills
Informatica
Data integration
Salesforce
JIRA
Oracle
NodeJS (Node.js)
Knowledge & Experience:
  • 4-7 years of industry experience in IT or consulting organizations
  • 3+ years of experience defining and delivering Informatica Cloud Data Integration & Application Integration enterprise applications in a lead developer role
  • Must have working knowledge of integrating with Salesforce, Oracle DB, and JIRA Cloud
  • Must have working scripting knowledge (Windows or Node.js); see the sketch below
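
By way of illustration, here is a minimal Python sketch of the kind of glue scripting the role calls for (the listing says Windows or Node.js; the pattern is identical), pulling recently updated issues from the JIRA Cloud REST API. The domain, credentials, and JQL are placeholders:

```python
# Hypothetical sketch: query recently updated issues from JIRA Cloud.
# The domain, email, API token, and JQL below are placeholders.
import requests

JIRA_BASE = "https://your-domain.atlassian.net"  # placeholder domain
AUTH = ("you@example.com", "your-api-token")     # JIRA Cloud uses email + API token

def fetch_recent_issues(jql: str = "updated >= -1d", max_results: int = 50):
    """Return a list of (key, summary) tuples for issues matching the JQL."""
    resp = requests.get(
        f"{JIRA_BASE}/rest/api/2/search",
        params={"jql": jql, "maxResults": max_results, "fields": "summary"},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return [(i["key"], i["fields"]["summary"]) for i in resp.json()["issues"]]

if __name__ == "__main__":
    for key, summary in fetch_recent_issues():
        print(key, "-", summary)
```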


Soft Skills 

  • Superb interpersonal skills, both written and verbal, to develop materials appropriate for a variety of audiences across business and technical teams
  • Strong presentation skills; able to present and defend a point of view to business and IT audiences
  • Excellent analysis skills and ability to rapidly learn and take advantage of new concepts, business models, and technologies

About the company: several years of experience in designing web applications


Similar jobs

Bengaluru (Bangalore)
3 - 8 yrs
₹14L - ₹20L / yr
Business Intelligence (BI)
PowerBI
Windows Azure
Git
SVN
+9 more

Power BI Developer (Azure Developer)

Job Description:

A senior visualization engineer with an understanding of Azure Data Factory and Databricks, to develop and deliver solutions that get information to audiences in support of key business processes.

Ensure code and design quality through the execution of test plans, and assist in developing standards and guidelines, working closely with internal and external design, business, and technical counterparts.

 

Desired Competencies:

  • Strong data-visualization design concepts centered on the business user, and a knack for communicating insights visually.
  • Ability to produce any of the available charting methods with drill-down options and action-based reporting, including using the right graphs for the underlying data with company themes and objects.
  • Publishing reports and dashboards on a reporting server and providing role-based access to users.
  • Ability to create wireframes in any tool for communicating the reporting design.
  • Creation of ad-hoc reports and dashboards to visually communicate data hub metrics (metadata information) for top management.
  • Should be able to handle huge volumes of data from databases such as SQL Server, Synapse, Delta Lake, or flat files, and create high-performance dashboards (see the sketch after this list).
  • Should be strong in Power BI development.
  • Expertise in two or more BI (visualization) tools for building reports and dashboards.
  • Understanding of Azure components such as Azure Data Factory, Data Lake Store, SQL Database, and Azure Databricks.
  • Strong knowledge of SQL queries.
  • Must have worked in full life-cycle development from functional design to deployment.
  • Intermediate ability to format, process, and transform data.
  • Should have working knowledge of Git and SVN.
  • Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.
  • Basic understanding of data modelling and the ability to combine data from multiple sources to create integrated reports.
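
To make the point about high-performance dashboards over huge data volumes concrete, a common pattern is to push aggregation down to the database so the BI layer imports a small summary instead of millions of raw rows. A minimal sketch, assuming a SQL Server/Synapse source; the server, table, and column names are placeholders:

```python
# Hypothetical sketch: pre-aggregate in the database so the dashboard
# layer loads a compact summary table rather than raw fact rows.
# Server, database, and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server.database.windows.net;"
    "DATABASE=your_db;UID=your_user;PWD=your_password"
)

SUMMARY_SQL = """
SELECT region, CAST(order_date AS DATE) AS order_day,
       SUM(amount) AS total_amount, COUNT(*) AS order_count
FROM sales.orders
GROUP BY region, CAST(order_date AS DATE)
"""

cursor = conn.cursor()
for region, day, total, count in cursor.execute(SUMMARY_SQL):
    print(region, day, total, count)
conn.close()
```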

 

Preferred Qualifications:

  • Bachelor's degree in Computer Science or Technology
  • Proven success in contributing to a team-oriented environment
Chennai
5 - 8 yrs
₹6L - ₹7L / yr
Relational Database (RDBMS)
NOSQL Databases
MySQL
MS SQLServer
SQL server
+13 more

Database Architect

5-6 years of experience

Good knowledge of relational and non-relational databases

Able to write complex queries, identify problematic queries, and provide solutions (see the sketch below)

Good hands-on experience with database tools

Experience with both SQL and NoSQL databases such as SQL Server, PostgreSQL, MongoDB, MariaDB, etc.

Experience with data model preparation, database structuring, etc.
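
As one concrete reading of the "identify problematic queries and provide a solution" line, a minimal PostgreSQL sketch: run EXPLAIN ANALYZE on a suspect statement and look for sequential scans over large tables. Connection details and names are placeholders:

```python
# Hypothetical sketch: use EXPLAIN ANALYZE to inspect a suspect query.
# Connection parameters and the table name are placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="app", password="secret")
suspect_query = "SELECT * FROM orders WHERE customer_email = %s"

with conn.cursor() as cur:
    cur.execute("EXPLAIN ANALYZE " + suspect_query, ("user@example.com",))
    plan = [row[0] for row in cur.fetchall()]

for line in plan:
    print(line)

# A "Seq Scan on orders" over millions of rows suggests a missing index,
# e.g. CREATE INDEX idx_orders_email ON orders (customer_email);
conn.close()
```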

Egnyte
Posted by Prasanth Mulleti
Remote only
3 - 6 yrs
₹10L - ₹20L / yr
Data Analytics
Salesforce
Customer Relationship Management (CRM)
MS-Office

Data Analyst

 

The Data Analyst supports business stewards in the implementation of, and best practices for, data quality concepts and tools. He/she will be accountable for providing and implementing end-to-end solutions for large, complex projects and executing data quality processes. This role operationally supports the larger sales and marketing operations in the organization, partnering with the business ops team and other business partners to analyze and resolve issues through troubleshooting and root cause analysis. The Data Analyst will be responsible for conducting extensive research on web portals to capture information relevant to the business. The analyst will also build reports and data sets to monitor data quality, ensuring crisp data is available to drive decision-making. As a key member of the Business Operations team, the Data Analyst will be responsible for the quality and integrity of the data housed in core business systems, including Salesforce.com and Marketo, among others.

What You’ll Do (including but not limited to)

  • Raising data quality and performance levels in our organization, especially with respect to critical data elements, including accounts, contacts, leads, opportunities, and closed business.
  • Establishing and maintaining a set of processes and controls aimed at ensuring data quality and integrity issues are surfaced and addressed on an ongoing basis
  • Creating and publishing data hygiene reports so that the broader organization has visibility into the quality and integrity of the data, both at any given point in time and on an ongoing basis
  • Ensuring best-in-class data sources are leveraged to enrich the quality of the data in our core systems
  • Working closely with other members of the Business Operations team to surface and address data quality issues
  • Ensuring that the data leveraged in core operational reporting is accurate and complete at all times
  • Validating and processing data import inquiries
  • Identifying, investigating, and managing the resolution and escalation of data quality issues
  • Coordinating regular and ongoing data quality audits
  • Coordinating various data cleanup tasks, including: deduping account and contact data (see the sketch after this list), ensuring data enrichment via our standard tools is performed consistently and correctly, and overseeing the SFDC account creation process to prevent duplicates and ensure all internal requirements are met
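
As an illustration of the dedup task above, a minimal pandas sketch (the file and column names are assumptions) that normalizes emails and company names before flagging duplicates, the usual first pass before merging records in Salesforce:

```python
# Hypothetical sketch: first-pass dedup of exported contact records.
# The file name and column names ("Email", "Company") are assumptions.
import pandas as pd

contacts = pd.read_csv("contacts_export.csv")

# Normalize the fields used as the duplicate key.
contacts["email_norm"] = contacts["Email"].str.strip().str.lower()
contacts["company_norm"] = (
    contacts["Company"].str.strip().str.lower()
    .str.replace(r"[.,]|\b(?:inc|llc|ltd)\b", "", regex=True)
    .str.strip()
)

# Keep the first record per key; route the rest for manual review/merge.
key = ["email_norm", "company_norm"]
dupes = contacts[contacts.duplicated(subset=key, keep="first")]
clean = contacts.drop_duplicates(subset=key, keep="first")

dupes.to_csv("duplicates_for_review.csv", index=False)
print(f"{len(dupes)} duplicate rows flagged out of {len(contacts)}")
```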

 

Your Qualifications 

  • BA/BS degree or equivalent practical experience
  • Minimum of 1 year in a data management role
  • Extensive research capabilities across web portals
  • Knowledge of MS Office, especially Excel
  • Hands-on experience with CRM applications, Salesforce preferred
  • Knowledge of the Dun & Bradstreet hierarchy
  • Fluency in English, verbal as well as written
  • Aptitude for computer systems and software, and strong analytical abilities
  • Ability to manage multiple priorities
  • Logical thinker with superb interpersonal skills
  • Ability to work well under pressure
  • Organized and detail-oriented
  • Comfortable with ambiguity
  • Customer service oriented
  • Great teammate
  • Salesforce experience preferred

 

About Egnyte

In a content-critical age, Egnyte fuels business growth by enabling content-rich business processes, while also providing organizations with visibility and control over their content assets. Egnyte’s cloud-native content services platform leverages the industry’s leading content intelligence engine to deliver a simple, secure, and vendor-neutral foundation for managing enterprise content across business applications and storage repositories. More than 16,000 customers trust Egnyte to enhance employee productivity, automate data management, and reduce file-sharing cost and complexity. Investors include Google Ventures, Kleiner Perkins Caufield & Byers, and Goldman Sachs. For more information, visit www.egnyte.com

Archwell
Agency job
via AVI Consulting LLP by Sravanthi Puppala
Mysore
2 - 8 yrs
₹1L - ₹15L / yr
Snowflake
Python
SQL
Amazon Web Services (AWS)
Windows Azure
+6 more

Title:  Data Engineer – Snowflake

 

Location: Mysore (Hybrid model)

Experience: 2-8 yrs

Type: Full Time

Walk-in date: 25th Jan 2023 @Mysore 

 

Job Role: We are looking for an experienced Snowflake developer to join our team as a Data Engineer, working as part of a team to help design and develop data-driven solutions that deliver insights to the business. The ideal candidate is a data pipeline builder and data wrangler who enjoys building data-driven systems from the ground up to drive analytical solutions. You will be responsible for building and optimizing our data pipelines as well as building automated processes for production jobs. You will support our software developers, database architects, data analysts, and data scientists on data initiatives.

 

Key Roles & Responsibilities:

  • Use advanced Snowflake, Python, and SQL to extract data from source systems for ingestion into a data pipeline.
  • Design, develop, and deploy scalable and efficient data pipelines.
  • Analyze and assemble large, complex datasets that meet functional and non-functional business requirements.
  • Identify, design, and implement internal process improvements, for example: automating manual processes, optimizing data delivery, and re-designing data platform infrastructure for greater scalability.
  • Build the infrastructure required for optimal extraction, loading, and transformation (ELT) of data from various data sources using AWS and Snowflake, leveraging Python or SQL (see the sketch after this list).
  • Monitor cloud-based systems and components for availability, performance, reliability, security, and efficiency.
  • Create and configure appropriate cloud resources to meet the needs of the end users.
  • As needed, document topology, processes, and solution architecture.
  • Share your passion for staying on top of tech trends and experimenting with and learning new technologies.
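
A minimal sketch of the extract/load step described above, using the snowflake-connector-python package. The account, credentials, stage, and table names are placeholders, and staging.raw_events is assumed to have a single VARIANT column named v:

```python
# Hypothetical sketch: land files from an external S3 stage into a
# Snowflake table, then run a light transformation. All identifiers
# are placeholders; staging.raw_events has one VARIANT column (v).
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

with conn.cursor() as cur:
    # Bulk-load staged JSON files (an external stage over S3 is assumed).
    cur.execute("""
        COPY INTO staging.raw_events
        FROM @s3_events_stage
        FILE_FORMAT = (TYPE = 'JSON')
    """)
    # Light transformation into a typed table.
    cur.execute("""
        INSERT INTO analytics.events (event_id, event_ts, payload)
        SELECT v:id::STRING, v:ts::TIMESTAMP_NTZ, v
        FROM staging.raw_events
    """)
conn.close()
```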

 

Qualifications & Experience

Qualification & Experience Requirements:

  • Bachelor's degree in computer science, computer engineering, or a related field.
  • 2-8 years of experience working with Snowflake.
  • 2+ years of experience with AWS services.
  • Candidate should be able to write stored procedures and functions in Snowflake (a sketch follows this list).
  • At least 2 years’ experience as a Snowflake developer.
  • Strong SQL knowledge.
  • Data ingestion into Snowflake using Snowflake procedures.
  • ETL experience is a must (any tool).
  • Candidate should be familiar with Snowflake architecture.
  • Experience on a migration project.
  • Data warehouse concepts (optional).
  • Experience with cloud data storage and compute components, including Lambda functions, EC2 instances, and containers.
  • Experience with data pipeline and workflow management tools such as Airflow.
  • Experience cleaning, testing, and evaluating data quality from a wide variety of ingestible data sources.
  • Experience working with Linux and UNIX environments.
  • Experience profiling data, with and without data definition documentation.
  • Familiar with Git.
  • Familiar with issue tracking systems like JIRA or Trello.
  • Experience working in an agile environment.
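
For the stored-procedure item, a small Snowflake Scripting procedure created and called from Python; the table, column, and retention window are illustrative, not from the listing:

```python
# Hypothetical sketch: create and call a Snowflake Scripting procedure
# that purges stale staging rows. Table, column, and cutoff are
# illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)

CREATE_PROC = """
CREATE OR REPLACE PROCEDURE purge_stale_rows(days_to_keep INTEGER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
BEGIN
  DELETE FROM staging.raw_events
  WHERE load_ts < DATEADD(day, -1 * :days_to_keep, CURRENT_TIMESTAMP());
  RETURN 'ok';
END;
$$
"""

with conn.cursor() as cur:
    cur.execute(CREATE_PROC)
    cur.execute("CALL purge_stale_rows(30)")  # keep the last 30 days
    print(cur.fetchone()[0])
conn.close()
```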

Desired Skills:

  • Experience in Snowflake; must be willing to become Snowflake certified within the first 3 months of employment.
  • Experience with a stream-processing system such as Snowpipe
  • Working knowledge of AWS or Azure
  • Experience in migrating from on-prem to cloud systems
Open Eyes software solutions pvt
Vadodara
2 - 7 yrs
₹2L - ₹7L / yr
Amazon Web Services (AWS)
Machine Learning (ML)
Python
API
Algorithms
+2 more
Key Skills
o Convert machine learning models into APIs for application access (see the serving sketch after this list)
o Running machine learning tests and experiments
o Implementing appropriate ML algorithms
o Creating machine learning models and retraining systems
o Study and transform data science prototypes
o Design machine learning systems
o Research and implement appropriate ML algorithms and tools
o Train and retrain systems when necessary
o Test and deploy models
o Use AI to empower the company with novel capabilities
o Designing and developing machine learning and deep learning systems
o Outstanding analytical and problem-solving skills
• Alexa
o Excellent Python programming skills
o Experience with AWS Lambda
o Experience with Alexa skills
o Alexa skill directives
• Google
o Excellent NodeJS programming skills
o Experience with GCP, Dialogflow, and Actions on Google
o Using built-in intents and developing custom intents
o API integration and Postman knowledge
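
A minimal sketch of the "model as an API" item above: a FastAPI endpoint wrapping a pickled scikit-learn model. The model file and feature layout are placeholders:

```python
# Hypothetical sketch: expose a pickled scikit-learn model as an HTTP API.
# The model file path and flat feature layout are placeholders.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.joblib")  # trained elsewhere; placeholder path

class Features(BaseModel):
    values: list[float]  # flat feature vector; adapt to the real schema

@app.post("/predict")
def predict(features: Features):
    # scikit-learn expects a 2D array: one row per prediction request.
    pred = model.predict([features.values])
    return {"prediction": pred.tolist()[0]}

# Run with:  uvicorn app:app --reload
# Then POST /predict with a body like {"values": [1.2, 3.4, 5.6]}
```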
Velocity Services
Posted by Newali Hazarika
Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data engineering
Oracle
+7 more

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.

 

We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.

 

Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!

 

The technology stack at Velocity comprises a wide variety of cutting-edge technologies: NodeJS, Ruby on Rails, reactive programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.

 

Key Responsibilities

  • Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling, and overseeing overall data quality

  • Work with the Office of the CTO as an active member of our architecture guild

  • Writing pipelines to consume data from multiple sources

  • Writing a data transformation layer using DBT to transform millions of rows of data for the data warehouse

  • Implement data warehouse entities with common, re-usable data model designs, with automation and data quality capabilities

  • Identify downstream implications of data loads/migration (e.g., data quality, regulatory)

 

What To Bring

  • 5+ years of software development experience; startup experience is a plus

  • Past experience working with Airflow and DBT is preferred (a minimal DAG sketch follows this list)

  • 5+ years of experience working in any backend programming language

  • Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server, or MySQL

  • Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development)

  • Experienced in formulating ideas, building proofs-of-concept (POCs), and converting them into production-ready projects

  • Experience building and deploying applications on on-premise infrastructure and on AWS or Google Cloud

  • Basic understanding of Kubernetes and Docker is a must

  • Experience in data processing (ETL, ELT) and/or cloud-based platforms

  • Working proficiency and communication skills in verbal and written English
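
To ground the Airflow + DBT pairing mentioned above, a minimal DAG sketch that builds and then tests dbt models on a daily schedule; the project path and schedule are assumptions:

```python
# Hypothetical sketch: schedule dbt transformations with Airflow.
# The dbt project path and the daily schedule are assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/airflow/dbt/analytics"  # placeholder project path

with DAG(
    dag_id="dbt_transformations",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test",
    )
    dbt_run >> dbt_test  # test the models only after they build
```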

 

Technology service company
Remote only
5 - 10 yrs
₹10L - ₹20L / yr
Relational Database (RDBMS)
NOSQL Databases
NOSQL
Performance tuning
SQL
+10 more

Preferred Education & Experience:

  • Bachelor’s or master’s degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field, or equivalent practical experience; at least 3 years of relevant experience in lieu of the above if from a different stream of education.

  • Well-versed in and 5+ years of hands-on demonstrable experience with:
    ▪ Data Analysis & Data Modeling
    ▪ Database Design & Implementation
    ▪ Database Performance Tuning & Optimization
    ▪ PL/pgSQL & SQL

  • 5+ years of hands-on development experience in Relational Database (PostgreSQL/SQL Server/Oracle).

  • 5+ years of hands-on development experience in SQL and PL/pgSQL, including stored procedures, functions, triggers, and views (see the PL/pgSQL sketch after this list).

  • Demonstrable hands-on working experience with database design principles, SQL query optimization techniques, index management, integrity checks, statistics, and isolation levels.

  • Demonstrable hands-on working experience with database read and write performance tuning and optimization.

  • Knowledge of and working experience in Domain-Driven Design (DDD) concepts, Object-Oriented Programming (OOP) concepts, cloud architecture concepts, and NoSQL database concepts are added value.

  • Knowledge and working experience in Oil & Gas, Financial, & Automotive Domains is a plus

  • Hands-on development experience in one or more NoSQL data stores such as Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc. is a plus.
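
As an illustration of the PL/pgSQL requirement, a small audit-trigger function installed from Python (PostgreSQL 11+); the tables and columns are placeholders:

```python
# Hypothetical sketch: install a PL/pgSQL audit trigger from Python.
# Table and column names are placeholders; requires PostgreSQL 11+.
import psycopg2

DDL = """
CREATE TABLE IF NOT EXISTS orders_audit (
    order_id   BIGINT,
    old_status TEXT,
    new_status TEXT,
    changed_at TIMESTAMPTZ DEFAULT now()
);

CREATE OR REPLACE FUNCTION log_status_change() RETURNS trigger AS $$
BEGIN
    IF NEW.status IS DISTINCT FROM OLD.status THEN
        INSERT INTO orders_audit (order_id, old_status, new_status)
        VALUES (OLD.id, OLD.status, NEW.status);
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS trg_order_status ON orders;
CREATE TRIGGER trg_order_status
    AFTER UPDATE ON orders
    FOR EACH ROW EXECUTE FUNCTION log_status_change();
"""

conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="app", password="secret")
with conn, conn.cursor() as cur:  # the connection context commits on success
    cur.execute(DDL)
conn.close()
```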

Consulting Leader
Pune, Mumbai
8 - 10 yrs
₹8L - ₹16L / yr
Data integration
Talend
Hadoop
Integration
Java
+1 more

 

Job Description

Role: Data/Integration Architect

Experience – 8-10 Years

Notice Period: Under 30 days

Key Responsibilities: Designing and developing frameworks for batch and real-time jobs on Talend; leading the migration of these jobs from MuleSoft to Talend; maintaining best practices for the team; conducting code reviews and demos.

Core Skillsets:

Talend Data Fabric - Application Integration, API Integration, Data Integration. Knowledge of Talend Management Cloud, and deployment and scheduling of jobs using TMC or Autosys.

Programming Languages - Python/Java
Databases: SQL Server, Other Databases, Hadoop

Should have worked in Agile environments

Sound communication skills

Should be open to learning new technologies based on business needs on the job

Additional Skills:

Awareness of other data/integration platforms like MuleSoft and Camel

Awareness of Hadoop, Snowflake, and S3

Mobile Programming India Pvt Ltd
Posted by Pawan Tiwari
Remote, Bengaluru (Bangalore), Gurugram, Chennai, Pune, Mohali, Dehradun
4 - 8 yrs
₹6L - ₹11L / yr
Oracle NoSQL Database
Oracle Database
Data engineering
Big Data
Oracle
+1 more

Looking for a Data Engineer for our own organization.

Notice period: 15-30 days
CTC: up to ₹15 LPA

 

Preferred Technical Expertise 

  1. Expertise in Python programming.
  2. Proficient in the Pandas/NumPy libraries (see the sketch after this list).
  3. Experience with the Django framework and API development.
  4. Proficient in writing complex queries using SQL.
  5. Hands-on experience with Apache Airflow.
  6. Experience with source code versioning tools such as Git, Bitbucket, etc.
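
A minimal sketch tying the Pandas/NumPy and SQL items together: load a table through SQLAlchemy and compute a summary. The connection string, table, and columns are placeholders:

```python
# Hypothetical sketch: pull a table into pandas via SQLAlchemy and
# compute a NumPy-backed summary. All identifiers are placeholders.
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql://app:secret@localhost/appdb")  # placeholder

orders = pd.read_sql(
    "SELECT customer_id, amount, created_at FROM orders",
    engine,
    parse_dates=["created_at"],
)

# Monthly revenue per customer, plus a NumPy percentile for outlier checks.
monthly = (
    orders.set_index("created_at")
          .groupby("customer_id")["amount"]
          .resample("M").sum()
          .reset_index()
)
p95 = np.percentile(orders["amount"], 95)
print(monthly.head())
print(f"95th percentile order amount: {p95:.2f}")
```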

 Good to have skills:

  1. Creating and maintaining optimal data pipeline architecture.
  2. Experience handling large structured data.
  3. Demonstrated ability in solutions covering data ingestion, data cleansing, ETL, data mart creation, and exposing data to consumers.
  4. Experience with any cloud platform (GCP is a plus).
  5. Experience with jQuery, HTML, JavaScript, and CSS is a plus.

If interested, kindly share your CV.
Chennai, Bengaluru (Bangalore), Hyderabad
4 - 10 yrs
₹9L - ₹20L / yr
Informatica
Informatica Developer
Informatica MDM
Data integration
Informatica Data Quality
+7 more
  • Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
  • Must have strong skills in data analysis, data mapping for ETL processes, and data modeling.
  • Experience with the SIF framework, including real-time integration.
  • Should have experience in building C360 Insights using Informatica.
  • Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
  • Should have experience in building different data warehouse architectures like Enterprise, Federated, and Multi-Tier.
  • Should have experience in configuring Informatica Data Director with reference to the data governance of users, IT managers, and data stewards.
  • Should have good knowledge of developing complex PL/SQL queries.
  • Should have working experience with UNIX and shell scripting to run Informatica workflows and control the ETL flow (see the sketch below).
  • Should know about Informatica server installation and have knowledge of the Administration Console.
  • Working experience with the Developer tool, along with Administration, is added knowledge.
  • Working experience with Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
  • Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.
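
As a sketch of the UNIX scripting item, a Python wrapper around PowerCenter's pmcmd CLI. All names are placeholders, and flag spellings vary across Informatica versions, so treat this as illustrative rather than definitive:

```python
# Hypothetical sketch: kick off an Informatica PowerCenter workflow via
# pmcmd and fail the wrapper if the workflow fails. Service, domain,
# folder, and workflow names are placeholders; flags may differ by version.
import subprocess
import sys

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "INT_SVC",          # integration service (placeholder)
    "-d", "DOMAIN_DEV",        # domain (placeholder)
    "-u", "etl_user", "-p", "***",
    "-f", "SALES_FOLDER",      # repository folder (placeholder)
    "-wait",                   # block until the workflow completes
    "wf_load_sales",           # workflow name (placeholder)
]

result = subprocess.run(cmd, capture_output=True, text=True)
print(result.stdout)
if result.returncode != 0:
    print(result.stderr, file=sys.stderr)
    sys.exit(result.returncode)  # propagate failure to the scheduler
```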