16+ Data integration Jobs in India
Position Overview: We are looking for an experienced and highly skilled Data Architect to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Architect, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance. In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.
Key Responsibilities:
• Customer Collaboration:
– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.
– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.
• Data Modeling & Integration:
– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.
– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories.
– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems.
• Data Processing & Optimization:
– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.
– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.
• Data Governance & Security:
– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).
– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.
• Cross-Functional Collaboration:
– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.
– Foster collaboration across teams to streamline data workflows and optimize solution delivery.
• Leveraging Advanced Technologies:
– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.
– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.
• Cost Optimization:
– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.
– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.
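The ETL responsibilities above (extract from sources, transform, load into a repository) can be sketched in miniature. The source records, field names, and target table below are hypothetical stand-ins, with sqlite3 standing in for the real data store:

```python
import sqlite3

def extract(rows):
    """Extract: in practice this would read from an API, file, or source DB."""
    return list(rows)

def transform(rows):
    """Transform: normalize names and drop records missing a customer id."""
    out = []
    for r in rows:
        if r.get("customer_id") is None:
            continue  # quarantine/drop records that fail validation
        out.append({"customer_id": r["customer_id"],
                    "name": r.get("name", "").strip().title(),
                    "amount": float(r.get("amount", 0))})
    return out

def load(conn, rows):
    """Load: write the cleaned records into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(customer_id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (:customer_id, :name, :amount)", rows)
    conn.commit()

if __name__ == "__main__":
    source = [{"customer_id": 1, "name": "  ada lovelace ", "amount": "9.5"},
              {"customer_id": None, "name": "bad row", "amount": "1"}]
    conn = sqlite3.connect(":memory:")
    load(conn, transform(extract(source)))
    print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())
```

Real workflows add incremental watermarks, schema checks, and monitoring, but the extract/transform/load separation stays the same.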
Qualifications:
• Experience:
– Proven experience (5+ years) as a Data Architect or similar role, designing and implementing data solutions at scale.
– Strong expertise in data modelling, data integration (ETL), and data transformation processes.
– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).
• Technical Skills:
– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).
– Strong understanding of data security protocols, privacy regulations, and compliance requirements.
– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).
• AI & Machine Learning Exposure:
– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.
– Ability to apply advanced algorithms and automation techniques to improve business processes.
• Soft Skills:
– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.
– Strong problem-solving ability with a customer-centric approach to solution design.
– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.
• Education:
– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).
LIFE AT FOUNTANE:
- Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
- Competitive pay
- Health insurance for spouses, kids, and parents.
- PF/ESI or equivalent
- Individual/team bonuses
- Employee stock ownership plan
- Fun/challenging variety of projects/industries
- Flexible workplace policy - remote/physical
- Flat organization - no micromanagement
- Individual contribution - set your deadlines
- Above all - culture that helps you grow exponentially!
A LITTLE BIT ABOUT THE COMPANY:
Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.
We’re a team of 80+ strong from around the world who are radically open-minded, believe in excellence, respect one another, and push our boundaries further than ever before.
A NASDAQ-listed, service-provider IT company
Job Summary:
As a Cloud Architect at our organization, you will play a pivotal role in designing, implementing, and maintaining our multi-cloud infrastructure. You will work closely with various teams to ensure our cloud solutions are scalable, secure, and efficient across different cloud providers. Your expertise in multi-cloud strategies, database management, and microservices architecture will be essential to our success.
Key Responsibilities:
- Design and implement scalable, secure, and high-performance cloud architectures across multiple cloud platforms (AWS, Azure, Google Cloud Platform).
- Lead and manage cloud migration projects, ensuring seamless transitions between on-premises and cloud environments.
- Develop and maintain cloud-native solutions leveraging services from various cloud providers.
- Architect and deploy microservices using REST and GraphQL to support our application development needs.
- Collaborate with DevOps and development teams to ensure best practices in continuous integration and deployment (CI/CD).
- Provide guidance on database architecture, including relational and NoSQL databases, ensuring optimal performance and security.
- Implement robust security practices and policies to protect cloud environments and data.
- Design and implement data management strategies, including data governance, data integration, and data security.
- Stay up to date with the latest industry trends and emerging technologies to drive continuous improvement and innovation.
- Troubleshoot and resolve cloud infrastructure issues, ensuring high availability and reliability.
- Optimize cost and performance across different cloud environments.
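A microservice's REST surface, as mentioned in the responsibilities above, reduces to routing a method and path to a handler. A minimal sketch follows; the resource names and the in-memory "database" are invented for illustration, not any particular cloud API:

```python
# Minimal REST-style dispatcher for a hypothetical "instances" resource.
DB = {"1": {"id": "1", "status": "running"}}

def get_instance(instance_id):
    """Handler for GET /instances/<id>: return (status_code, body)."""
    item = DB.get(instance_id)
    return (200, item) if item else (404, {"error": "not found"})

def dispatch(method, path):
    # Real services use a framework router; this shows only the mapping idea.
    parts = path.strip("/").split("/")
    if method == "GET" and len(parts) == 2 and parts[0] == "instances":
        return get_instance(parts[1])
    return (404, {"error": "no route"})
```

A GraphQL layer differs mainly in letting the client name the fields it wants, but the handler-per-resource decomposition underneath is similar.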
Qualifications/ Experience & Skills Required:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: 10 - 15 Years
- Proven experience as a Cloud Architect or in a similar role, with a strong focus on multi-cloud environments.
- Expertise in cloud migration projects, both lift-and-shift and greenfield implementations.
- Strong knowledge of cloud-native solutions and microservices architecture.
- Proficiency in using GraphQL for designing and implementing APIs.
- Solid understanding of database technologies, including SQL, NoSQL, and cloud-based database solutions.
- Experience with DevOps practices and tools, including CI/CD pipelines.
- Excellent problem-solving skills and ability to troubleshoot complex issues.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Deep understanding of cloud security practices and data protection regulations (e.g., GDPR, HIPAA).
- Experience with data management, including data governance, data integration, and data security.
Preferred Skills:
- Certifications in multiple cloud platforms (e.g., AWS Certified Solutions Architect, Google Certified Professional Cloud Architect, Microsoft Certified: Azure Solutions Architect).
- Experience with containerization technologies (Docker, Kubernetes).
- Familiarity with cloud cost management and optimization tools.
AWS Glue Developer
Work Experience: 6 to 8 Years
Work Location: Noida, Bangalore, Chennai & Hyderabad
Must Have Skills: AWS Glue, DMS, SQL, Python, PySpark, Data Integration and DataOps
Job Reference ID: BT/F21/IND
Job Description:
Design, build and configure applications to meet business process and application requirements.
Responsibilities:
7 years of work experience with ETL, data modelling, and data architecture. Proficient in ETL optimization and in designing, coding, and tuning big data processes using PySpark. Extensive experience building data platforms on AWS using core AWS services (Step Functions, EMR, Lambda, Glue, Athena, Redshift, Postgres, RDS, etc.) and designing/developing data engineering solutions. Orchestration using Airflow.
Technical Experience:
Hands-on experience developing a data platform and its components: data lake, cloud data warehouse, APIs, and batch and streaming data pipelines. Experience building data pipelines and applications to stream and process large datasets at low latency.
➢ Enhancements, new development, defect resolution and production support of big data ETL solutions using AWS native services.
➢ Create data pipeline architecture by designing and implementing data ingestion solutions.
➢ Integrate data sets using AWS services such as Glue, Lambda functions/ Airflow.
➢ Design and optimize data models on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Athena.
➢ Author ETL processes using Python and PySpark.
➢ Build Redshift Spectrum direct transformations and data modelling using data in S3.
➢ ETL process monitoring using CloudWatch events.
➢ You will be working in collaboration with other teams. Good communication skills are a must.
➢ Must have experience using AWS service APIs, the AWS CLI and SDKs
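One recurring pattern behind the Glue/Redshift data-model design work listed above is an idempotent staging-and-merge load, so that job reruns do not duplicate rows. A sketch of the idea, with sqlite3 standing in for the warehouse and hypothetical table/column names:

```python
import sqlite3

def merge_load(conn, staged_rows):
    """Upsert staged rows into the target table so reruns are idempotent --
    the same shape as a Redshift staging-table or Spectrum-based merge."""
    conn.execute("CREATE TABLE IF NOT EXISTS dim_customer "
                 "(id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany(
        "INSERT INTO dim_customer (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        staged_rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
merge_load(conn, [(1, "Ada"), (2, "Grace")])
merge_load(conn, [(2, "Grace H."), (3, "Edsger")])  # rerun with overlap
print(conn.execute("SELECT COUNT(*) FROM dim_customer").fetchone()[0])  # 3
```

In a real Glue job the staged rows would come from an S3/Spectrum read and the merge would run against Redshift, but the rerun-safety property is the point.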
Professional Attributes:
➢ Experience operating very large data warehouses or data lakes. Expert-level skills in writing and optimizing SQL. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technology.
➢ Must have 6+ years of big data ETL experience using Python, S3, Lambda, Dynamo DB, Athena, Glue in AWS environment.
➢ Expertise in S3, RDS, Redshift, Kinesis, EC2 clusters highly desired.
Qualification:
➢ Degree in Computer Science, Computer Engineering or equivalent.
Salary: Commensurate with experience and demonstrated competence
About us: EITACIES Inc is a Product Development and IT Services company, providing pioneering services in Digital Transformation, Cloud & Cyber Security, DevSecOps, AI & ML, Business Intelligence and Enterprise Integration. We have been supporting multiple Bay Area start-ups and Fortune 500 companies in different industry verticals since 2008. For more information, please visit www.eitacies.com
We are looking for an Automation Software Engineer to join the core team that is building our latest cloud security product - Prisma SaaS. This fast-growing cloud service provides next-generation security for enterprise SaaS applications such as Box, Dropbox, GitHub, Google Apps, Slack, Salesforce and many more. Prisma SaaS enables organizations to store terabytes of sensitive data in these applications while preventing any security threats to their cloud. This role will also give you a unique opportunity to collaborate with many of our supported SaaS vendors and build skills to influence every aspect of delivering an enterprise-class cloud security service.
Bring your backend Java cloud engineering skills to work on the latest cloud software/web applications. Help us deploy and scale the next generation of cloud security utilizing big data. Keys to success: experience building microservices, REST APIs, BigQuery, Elasticsearch, and AWS SQS.
With a great fit having:
● Strong expertise with React
● Strong expertise with RSpec
● Strong experience with Python
● Experience in End to End Testing automation for UI
● Experience in Unit/Integration testing of APIs
● Strong expertise with MongoDB and ElasticSearch
● Working knowledge of Docker
● Strong experience with open source tooling
● Bachelor’s Degree and/or equivalent relevant experience in a technical field
Your Impact:
- Responsible for complete software development process including End-to-End Automation for UI
- Write clean, testable, readable, scalable and maintainable code that scales and performs well for thousands of customers.
- Participate actively and contribute to design and development discussions.
- Develop a proven understanding of advanced Cloud Computing and Cloud Security concepts and be able to explain them to others
Your Experience:
● Strong expertise in latest UI technology
● Strong Experience in Python
● Collaborate with UX/UI designers and product designers to build user-friendly, immersive, reactive applications
● 5+ years of software data integration experience.
● 5+ years of JavaScript, HTML, and CSS.
● 5+ years of working knowledge of React, Angular, or an equivalent MVC framework.
● 5+ years of experience with the API toolset: REST, HTTP, GraphQL, JSON, XML, Postman, etc.
● Experience in SQL, MongoDB, ElasticSearch.
● Experience with Git, Continuous Integration, and Continuous Delivery mechanisms
● Experience with RSpec.
● Big Plus if you have CASB or general SaaS application experience.
● Big plus if you have experience with Data Security application.
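The unit/integration testing of APIs listed above usually comes down to asserting on response payloads. A minimal sketch follows; the endpoint shape and field names are invented for illustration, not the Prisma SaaS API:

```python
import json

def parse_assets(payload):
    """Validate and flatten a hypothetical /assets API response body."""
    data = json.loads(payload)
    if "items" not in data:
        raise ValueError("malformed response: missing 'items'")
    # Default missing risk ratings rather than failing the whole batch.
    return [(item["id"], item.get("risk", "unknown")) for item in data["items"]]

def test_parse_assets():
    body = json.dumps({"items": [{"id": "a1", "risk": "high"}, {"id": "a2"}]})
    assert parse_assets(body) == [("a1", "high"), ("a2", "unknown")]

test_parse_assets()
```

In a real suite the body would come from a live or mocked HTTP call (e.g. via a test client), with additional cases for error codes and malformed payloads.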
The notice period is 30 days or less.
In this role, you will:
As part of a team focused on preserving the customer experience across the organization, this Analytic Consultant will be responsible for:
- Understand business objectives and provide credible challenge to analysis requirements.
- Verify sound analysis practices and data decisions were leveraged throughout planning and data sourcing phases.
- Conduct in-depth research within complex data environments to identify data integrity issues and propose solutions to improve analysis accuracy.
- Apply critical evaluation to challenge assumptions, formulate a defensible hypothesis, and ensure high-quality analysis results.
- Ensure adherence to data management/ data governance regulations and policies.
- Perform and test highly complex data analytics for customer remediation.
- Design analysis project flows and documentation that are structured for consistency, easy to understand, and suitable for multiple levels of reviewers, partners, and regulatory agents, demonstrating the research and analysis completed.
- Investigate and ensure data integrity from multiple sources.
- Ensure data recommended and used is the best “source of truth”.
- Apply knowledge of business, customers, and products to synthesize data to 'form a story' and align information to contrast/compare with industry perspectives. The data involved is typically very large, structured or unstructured, and from multiple sources.
- Must have a strong attention to detail and be able to meet high quality standards consistently.
- Other duties as assigned by manager.
- Willing to assist on high-priority work outside of regular business hours or on weekends as needed.
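Investigating data integrity across sources, as described above, usually starts with simple reconciliation checks: duplicate keys, null keys, row counts. A sketch with a hypothetical key column:

```python
def integrity_report(rows, key="account_id"):
    """Flag duplicate and null keys -- two of the most common integrity
    issues when reconciling data pulled from multiple sources."""
    seen, dupes, nulls = set(), [], 0
    for r in rows:
        k = r.get(key)
        if k is None:
            nulls += 1
        elif k in seen:
            dupes.append(k)
        else:
            seen.add(k)
    return {"rows": len(rows), "distinct_keys": len(seen),
            "duplicate_keys": dupes, "null_keys": nulls}
```

Runs of this report against each source, compared side by side, make it easy to defend which dataset is the better "source of truth".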
Essential Qualifications:
- 5+ years in similar analytics roles
- Bachelor's, M.A./M.Sc., or higher degree in applied mathematics, statistics, engineering, physics, accounting, finance, economics, econometrics, computer science, or business/social and behavioral sciences with a quantitative emphasis.
- Preferred programming knowledge: SQL/SAS
- Knowledge of PVSI, Non-Lending, Student Loans, Small Business and Personal Lines and Loans is a plus.
- Strong experience with data integration, database structures and data warehouses.
- Persuasive written and verbal communication skills.
Desired Qualifications:
- Certifications in Data Science, or BI Reporting tools.
- Ability to prioritize work, meet deadlines, achieve goals and work under pressure in a dynamic and complex environment – Soft Skills.
- Detail-oriented, results-driven, and able to navigate a quickly changing, high-demand environment while balancing multiple priorities.
- Ability to research and report on a variety of issues using problem solving skills.
- Ability to act with integrity and a high level of professionalism with all levels of team members and management.
- Ability to make timely and independent judgment decisions while working in a fast-paced and results-driven environment.
- Ability to learn the business aspects quickly, multitask and prioritize between projects.
- Exhibits appropriate sense of urgency in managing responsibilities.
- Ability to accurately process high volumes of work within established deadlines.
- Available to flex schedule periodically based on business need.
- Demonstrate strong negotiation, communication & presentation skills.
- Demonstrates a high degree of reliability, integrity and trustworthiness.
- Takes ownership of assignments and helps drive assignments of the team.
- Dedicated, enthusiastic, driven and performance-oriented; possesses a strong work ethic and good team player.
- Be proactive and get engaged in organizational initiatives.
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies of its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last mile management system. While on this aggressive growth path, we seek to become the one-stop-shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.
Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and (RT, NRT, batch) reporting & dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.
What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet the data completeness, correction and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision & implement the optimal data modelling, physical design, and performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines and envision and build their successors.
Qualifications & Experience relevant for the role
• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching to make technology & design choices.
• Strong experience in System Integration, Application Development, ETL, and Data-Platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
• Expertise in relational and dimensional modelling
• Exposure across all SDLC processes
• Experience in cloud architecture (AWS)
• Proven track record of keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization understand application architectures and integration approaches, architect advanced cloud-based solutions, and help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
Global Media Agency - A client of Merito
Job Description
- Lead the Data Solutions function and be the POC for all Agency Digital and Data leads, liaising on client requirements and ensuring the appropriate solutions are rolled out.
- Provide general agency pitch support for data as needed, but more proactively create consulting frameworks around client first-party (1PD) data architecture, and provide guidance and recommendations around CDP vs. DMP implementation and the Mar-tech stack
- Establish working relationships and SOPs for implementation delivery of Mar-Tech projects
- Contribute to team projects as the executive sponsor for client data initiatives, support the agencies in audience planning strategies utilising all data assets
- Own the setup elements such as first party data, tag integration (tag management systems, custom data integrations, CRM connections, etc.), custom data partner integration development.
- Work with data partners to integrate their data sets into [m]insights (Onboarding & Testing Data Sets)
- Manage the Mar-tech partners, working on JBPs to achieve shared goals of client enablement
- Work closely with the global and regional product teams to test and roll out new features, especially with the cookieless future of audience planning tools
Requirements
- 10+ years of experience in Technology Consulting role within Adtech/Martech
- Bachelor’s or master’s degree in a relevant field (Data, Technology, IT, Marketing)
- Certifications (or advanced knowledge) in DMP, CDP, Mar-tech & digital analytics platforms
- A strong understanding of all digital marketing channels
- Familiar with advanced analytics and data management concepts
- Proven ability to build and grow relationships with key stakeholders
- Excellent written and verbal communication skills.
- Flexibility to work in a cross-functional team but also have the initiative to problem-solve independently.
- Highly organised with demonstrated project management capabilities
Job Description for :
Role: Data/Integration Architect
Experience – 8-10 Years
Notice Period: Under 30 days
Key Responsibilities: Designing and developing frameworks for batch and real-time jobs on Talend; leading the migration of these jobs from MuleSoft to Talend; maintaining best practices for the team; conducting code reviews and demos.
Core Skillsets:
Talend Data Fabric - Application Integration, API Integration, and Data Integration. Knowledge of Talend Management Console (TMC), and deployment and scheduling of jobs using TMC or Autosys.
Programming Languages - Python/Java
Databases: SQL Server, Other Databases, Hadoop
Should have worked in Agile environments
Sound communication skills
Should be open to learning new technologies on the job, based on business needs
Additional Skills:
Awareness of other data/integration platforms like MuleSoft, Camel
Awareness of Hadoop, Snowflake, S3
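A framework for batch jobs of the kind described above typically centres on ordered steps with per-step retries. A sketch in Python (Talend itself generates Java jobs; this only illustrates the orchestration shape, and the step functions are hypothetical):

```python
import time

def run_job(step_fns, retries=2, delay=0.0):
    """Run a batch job as an ordered list of step functions, retrying each
    failed step up to `retries` times before aborting the whole job."""
    results = []
    for step in step_fns:
        for attempt in range(retries + 1):
            try:
                results.append(step())
                break  # step succeeded; move to the next one
            except Exception:
                if attempt == retries:
                    raise  # exhausted retries: fail the job
                time.sleep(delay)  # back off before retrying
    return results
```

A migration from one integration platform to another largely consists of re-expressing each job's steps inside a standard skeleton like this, so retries, logging, and scheduling behave uniformly.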
A global business process management company
Designation – Deputy Manager - TS
Job Description
- Total of 8-9 years of development experience in Data Engineering. B1/BII role
- Minimum of 4-5 years in AWS Data Integration, with very good data modelling skills.
- Should be very proficient in end-to-end AWS data solution design, including not only strong data ingestion and integration skills (both data at rest and data in motion) but also complete DevOps knowledge.
- Should have experience in delivering at least 4 Data Warehouse or Data Lake Solutions on AWS.
- Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc.
- Strong Python skills.
- Should be an expert in cloud design principles, performance tuning, and cost modelling. AWS certifications will be an added advantage.
- Should be a team player with excellent communication skills, able to manage their work independently with minimal or no supervision.
- Life Science & Healthcare domain background will be a plus
Qualifications
BE/BTech/ME/MTech
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role, who has attained a Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
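The message-queuing and stream-processing knowledge listed above can be illustrated in miniature with a producer/consumer over a bounded queue; Kafka-style systems generalize exactly this across processes and machines. The doubling transform is a hypothetical stand-in for real per-record processing:

```python
import queue
import threading

def consume(q, out):
    """Drain the queue until the sentinel, applying a per-record transform."""
    while True:
        record = q.get()
        if record is None:      # sentinel: end of stream
            break
        out.append(record * 2)  # stand-in for real transformation logic

q = queue.Queue(maxsize=100)    # bounded queue gives natural backpressure
out = []
worker = threading.Thread(target=consume, args=(q, out))
worker.start()
for record in range(5):         # producer side
    q.put(record)
q.put(None)                     # signal end of stream
worker.join()
print(out)  # [0, 2, 4, 6, 8]
```

The bounded `maxsize` is the key design point: a full queue blocks the producer, which is the single-process analogue of backpressure in a streaming platform.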
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems, technical requirements to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
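"Develop complex queries... to solve analytics and data mining problems", as described above, often means conditional aggregation of event data. A small sketch; the schema and data are hypothetical, with sqlite3 as the stand-in engine:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, kind TEXT, value REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)",
                 [(1, "view", 1), (1, "buy", 20), (2, "view", 1), (2, "view", 1)])

# Conversion-style analytics: per-user event count and conditional spend.
rows = conn.execute("""
    SELECT user_id,
           COUNT(*)                                   AS events,
           SUM(CASE WHEN kind = 'buy' THEN value END) AS spend
    FROM events
    GROUP BY user_id
    ORDER BY user_id
""").fetchall()
print(rows)  # [(1, 2, 20.0), (2, 2, None)]
```

The `CASE` inside the aggregate is the workhorse here: it lets one scan of the table produce several differently-filtered metrics per group, which is how most "complex" analytics queries stay efficient.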
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience with 2+ years in analytics, data mining, and/or data warehouse
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail oriented, teamwork spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
Several years of experience in designing web applications
- 4-7 years of Industry experience in IT or consulting organizations
- 3+ years of experience defining and delivering Informatica Cloud Data Integration & Application Integration enterprise applications in a lead developer role
- Must have working knowledge on integrating with Salesforce, Oracle DB, JIRA Cloud
- Must have working scripting knowledge (Windows or Node.js)
Soft Skills
- Superb interpersonal skills, both written and verbal, in order to effectively develop materials that are appropriate for a variety of audiences across business & technical teams
- Strong presentation skills; successfully present and defend points of view to Business & IT audiences
- Excellent analysis skills and ability to rapidly learn and take advantage of new concepts, business models, and technologies
The Opportunity
As a Pre-Sales Engineer, you will be responsible for technical account leadership and management for the Denodo Sales Cycle across multiple strategic accounts, including relationships with technical counterparts within those organizations. The selected candidate will be working closely with experienced Sales, Marketing and Product Management personnel and will be supported by a strong technical team in an ideal, fast paced, rapidly growing startup environment to grow professionally and go beyond expectations.
Your career with us will combine cutting edge technology, exposure to worldwide clients across all industries (Financial Services, Automotive, Insurance, Pharma, etc.) and an exciting growth path as part of a global team.
Duties & Responsibilities
As a Sales Engineer, you will combine deep technical expertise with client communication and coordination between clients and internal Denodo teams to achieve your mission.
Product and Technical Knowledge:
- Obtain and maintain strong knowledge of the Denodo Platform; be able to deliver a superb demo and technical pitch, including an overview of our key and advanced features and benefits, services offerings, differentiation, and competitive positioning
- Be able to address the majority of technical questions concerning customization, integration, enterprise architecture, and the general features/functionality of our platform
- Be capable of building and/or leading the development of custom demos and PoCs based on, and going beyond, customer requirements
- Provide timely, prioritized, and complete customer-based feedback to Product Management, Sales, Support, and/or Development regarding customer requirements and issues
Presentation and organizational skills:
- Train and engage customers in the product architecture, configuration, and use of the Denodo Platform
- Know where to escalate within the Denodo technical organization, and make effective use of those resources
Customer engagement:
- Enthusiastically present software solutions to prospective customers, from IT managers and technicians to C-level executives
- Lead the completion of RFIs and RFPs
- Manage client expectations, establish credibility at all levels within the client, and build problem-solving partnerships with the client and colleagues
- Along with Marketing, Sales, Sales Engineering, and Product Management, and based on market or customer feedback, lead the definition, development, and presentation of product and solution demos and PoCs for prospects and partners
- Provide technical support to the entire sales team as appropriate (e.g. handle technical questions and escalate them when required; conduct training and/or briefings)
- Document and track all activity through CRM and an internal wiki, including lead/prospect data entry, detailed activity reports, follow-up tasks, architectures, challenges, etc.
Qualifications
Desired Skills & Experience
- BS or higher degree in Computer Science
- 5+ years as a Sales/Solution Engineer, Consultant or Enterprise Architect
- Must have excellent verbal and written communication skills to be able to interact with technical and business counterparts
- Knowledge of and experience with Enterprise Architecture best practices and frameworks is a plus
- A “can do” attitude
- Willingness to travel
Hands-on working experience in any of the following areas:
- Solid understanding of SQL and a good grasp of relational and analytical database management theory and practice. Understanding and optimization of complex queries.
- Understanding of Data Integration flavors (ETL/Data Warehousing, ESB/Messaging, DV).
- Understanding of BigData, NoSQL, in-memory, MPP environments is welcome.
- Technical skills include JDBC/ODBC, XML, JSON, REST APIs, Java development and version control systems. Good knowledge of software development and architectural patterns.
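As a rough illustration of the SQL and query-optimization fundamentals listed above, here is a minimal sketch using Python's built-in sqlite3; the table, index, and data are made up for the example:

```python
import sqlite3

# In-memory database standing in for a relational source (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    CREATE INDEX idx_orders_customer ON orders (customer_id);
""")
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(1, 100.0), (1, 50.0), (2, 75.0)],
)

# An aggregate query of the kind the posting refers to.
rows = conn.execute(
    "SELECT customer_id, SUM(amount) FROM orders "
    "WHERE customer_id = ? GROUP BY customer_id", (1,)
).fetchall()
print(rows)  # [(1, 150.0)]

# EXPLAIN QUERY PLAN reveals whether the index is used -- a basic first step
# when optimizing a complex query.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT customer_id, SUM(amount) FROM orders "
    "WHERE customer_id = 1 GROUP BY customer_id"
).fetchall()
print(plan)
```

EXPLAIN QUERY PLAN is SQLite-specific; other engines expose similar facilities (e.g. EXPLAIN / EXPLAIN ANALYZE in PostgreSQL) for the same check.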
If you are passionate about data and analytics, love integrating SaaS platforms, and enjoy talking to customers, you'll have a great time here at CustomerSuccessBox! We recognize both the opportunities and the challenges that come along.
We're looking for a great Implementation Engineer to assist our customers in integrating with 3rd-party tools and to help them get their data in via APIs, JavaScript, or other analytics tools, so that customers can start getting value from the CustomerSuccessBox (CSB) product. This is a great role for someone who loves data, wants to build a career in the Customer Success domain, and wants to work with a world-class team.
The Implementation Engineer role is a key part of the Product & Customer Success team. As an Implementation Engineer, you will be responsible for guiding customers in getting data from various 3rd-party tools and for helping the Customer Success team onboard customers. You will act as a trusted advisor and become an integral part of the company and of our customers' teams.
Your Primary Responsibilities
- Understand the customer's operations tech stack: what platforms they use for CRM, support, billing, analytics, etc.
- Hand-hold customers in syncing all their data sources with CSB; many of these are pre-built native integrations activated by a simple click-through flow.
- Devise smart engineering solutions for each customer's unique enterprise stack (CRM/Billing/Support/Analytics).
- You will get opportunities to learn and develop on different technologies and platforms.
- Understand the business requirements in order to prioritize data sources and objects.
- Conduct workshops with customers to break requirements into smaller, achievable pieces.
- Create an implementation framework to capture and decide what data is required.
- Plan meetings with customers and their developers to understand how their data is structured, and advise them on best practices for sending that data to CSB.
- Stay connected with the CSB technical team so that you can leverage them to drive customer implementations.
- Resolve customer issues, with or without collaboration with other teams.
- Build strong customer relationships by maintaining high levels of engagement and communication.
- Collaborate on creating a best-practices guide for developers, update the API documentation, and maintain the integration documentation.
- Work with the Customer Success team to share insights with customers and prepare customised reports in BI tools.
- Pass feedback to Product Management.
- Work with the Support and Customer Success teams to coordinate and prioritize open items.
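The data-syncing work above usually starts with mapping records from a customer's 3rd-party tool into a common schema. The sketch below shows that idea in Python; the CRM field names and the target schema are hypothetical, not the actual CSB API:

```python
# Normalize records from a hypothetical CRM export into a common account schema
# before syncing. All field names here are illustrative assumptions.

def normalize_crm_account(record: dict) -> dict:
    """Map a raw CRM record (hypothetical fields) to a normalized account."""
    return {
        "account_id": record["Id"],
        "name": record.get("Account_Name", "").strip(),
        # Revenue may arrive as a string or be missing entirely.
        "mrr": float(record.get("Monthly_Revenue", 0) or 0),
        "plan": (record.get("Subscription_Plan") or "unknown").lower(),
    }

raw = [
    {"Id": "A1", "Account_Name": " Acme Corp ", "Monthly_Revenue": "499",
     "Subscription_Plan": "Pro"},
    {"Id": "A2", "Account_Name": "Globex", "Monthly_Revenue": None},
]

normalized = [normalize_crm_account(r) for r in raw]
print(normalized[0])
# {'account_id': 'A1', 'name': 'Acme Corp', 'mrr': 499.0, 'plan': 'pro'}
```

In practice each customer's stack needs its own mapping of this shape, which is why the role emphasizes understanding how the customer's data is structured before writing any sync logic.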
What experience will you gain?
- Tech – learning about large enterprises' tech stacks
- Tech – Understanding enterprise technical architecture and implementation
- Tech - Understanding of APIs/JS/analytics
- Business / Tech - Use of enterprise platforms like Business Intelligence (BI) / SaaS platforms like CRM (Salesforce, Hubspot), support (Zendesk, Intercom, Freshdesk), Billing (Chargebee, Stripe, Quickbooks), etc.
- International Customer exposure – Talking to industry leaders
What Makes You The Right Fit
- Any experience in a customer-facing role.
- Hands-on with BI tools (good to have).
- An understanding of the B2B SaaS ecosystem and the tools companies typically use.
- Strong communication and interpersonal skills, both written and verbal; this is a must-have since you will be working with an international customer base.
- Undergraduate degree (BE/B.Tech) in Computer Science or Information Technology
What This Is Not
- This is NOT a 9 to 5 job.
- We don't offer or encourage any hierarchical structure.
Compensation
- Among the best in the industry.
- Cash, Incentives
Why CustomerSuccessBox?
CustomerSuccessBox is the leading AI powered Enterprise Customer Success platform. It helps global B2B SaaS businesses to get to 130%+ MRR Retention, unlock 3X upsell opportunities and double LTV.
In SaaS, 95% of the Lifetime value (LTV) is locked in as ‘Future’ Recurring Revenue. BUT businesses don't know who will Renew / Buy-more / At-Risk until it’s too LATE.
CustomerSuccessBox technology tracks real-time product adoption and customer engagement. Its AI-powered ‘Early Warning System’ uncovers the blind spots for proactive interventions. Its advanced automation leads to faster time to value, higher product adoption, and a reduced cost of customer success.
Leading customer-centric global enterprises such as Headset (Seattle), Aislelabs (Toronto), Locus (Bangalore), and Raken (San Diego) use CustomerSuccessBox.
People and Culture
CustomerSuccessBox is a B2B SaaS customer success platform, venture-backed (https://inc42.com/buzz/customersuccessbox-funding-pi-ventures-axilor/) by pi Ventures and Axilor. CustomerSuccessBox is founded and run by CEO Puneet Kataria, recognized by LinkedIn as a Global Top Voice of 2019.
At CustomerSuccessBox, you will become part of a passionate team that works together to create value for our customers and their customers.
As a member of the CustomerSuccessBox team, you will own problems and enjoy both authority and accountability over deliverables. You will be rewarded for business results delivered. All the credit for success is yours to take, but it comes with the responsibility of owning up to failures too; we call them ‘failed experiments’.
In return, we are committed to providing you with every opportunity to learn, grow and reach the highest level of your ability and potential.
CustomerSuccessBox is an equal opportunity employer. All candidates for employment will be considered without regard to race, color, religion, gender, ethnicity, national origin, disability, or sexual orientation.
A 15-year-old, US-based product company
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures such as Enterprise, Federated, and Multi-Tier.
- Should have experience in configuring Informatica Data Director with reference to the data governance of users, IT managers, and data stewards.
- Should have good knowledge of developing complex PL/SQL queries.
- Should have working experience with UNIX and shell scripting to run Informatica workflows and control the ETL flow.
- Should know about Informatica server installation and have knowledge of the Administration Console.
- Working experience with Informatica Developer alongside Administration is an added advantage.
- Working experience with Amazon Web Services (AWS) is an added advantage, particularly S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment.
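Running Informatica workflows from scripts, as mentioned above, is typically done through Informatica's pmcmd command-line tool. Here is a minimal sketch of a wrapper that builds a pmcmd startworkflow command (in Python rather than shell, for illustration); the service, domain, folder, and workflow names are placeholders, and pmcmd must be on the PATH in a real environment:

```python
import subprocess  # used only in a real run; see the note at the bottom

def build_pmcmd_start(workflow: str, folder: str, service: str, domain: str) -> list:
    """Build a pmcmd startworkflow command as an argument list for subprocess."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name
        "-d", domain,     # Domain name
        "-f", folder,     # Repository folder containing the workflow
        "-wait",          # Block until the workflow finishes, so the script
                          # can sequence downstream ETL steps on its exit code
        workflow,
    ]

# Placeholder names -- substitute the real environment's values.
cmd = build_pmcmd_start("wf_load_customers", "SALES", "IS_PROD", "Domain_Prod")
print(" ".join(cmd))
# In a real environment: result = subprocess.run(cmd, check=True)
```

Using -wait and checking the exit code is what lets a wrapper script control the ETL flow, e.g. only starting a downstream workflow after the upstream load succeeds.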
Do you want to work with a company that solves real-time challenges using Artificial Intelligence, Data Integration & Analytics? Then read on.
Our client is an Ad Exchange startup redefining health care marketing. The startup has built an integrated marketplace for advertising to health practitioners, allowing advertisers to access multiple digital health care platforms via a single window. They have developed a patented technology that allows them to do precision targeting of ads for physicians using programmatic media. They are soon to expand their services to the USA.
The founder is a qualified physician and an innovator at heart. He has immense experience in the health management sector and has worked for international healthcare organizations.
- Design and Execute Tests - Designing automated tests to validate applications by creating scripts that run testing functions automatically. This includes determining priority for test scenarios and creating execution plans to implement these scenarios.
- Identify and Report Bugs - Analysing bug reports and highlighting problems to help identify fixes for them. Delivering regular reports identifying these bugs to other members of the engineering team.
- Install Databases and Apps - Installing and setting up databases and backup applications to prevent errors and protect against data loss.
- Identify Quality Issues - Analysing systems to identify potential quality issues that could affect apps.
- Collaborate - Collaborating with other members of the engineering team to find the best methods for solving problems in apps and systems.
What you need to have:
- Bachelor’s degree
- 4+ years of work experience
- Understanding of adtech platforms is a big plus
- Diligent work ethic. Must be self-motivated and able to take the initiative to get the job done
- Programming – programming skills to write computer code and scripts in common languages, such as VBScript, Java, or Node.js
- Analytical skills – to examine bug reports, prioritize necessary tests, and streamline application functions through automated testing processes
- Problem-solving skills – to find bugs and create fixes for them
- Attention to detail – to test web and mobile applications to find ways to improve them and isolate problems
- Communication skills – need strong verbal communication skills to effectively collaborate with the engineering team and create written reports showing errors and testing plans
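The "design and execute tests" duty above can be sketched with a small automated test suite. The posting mentions VBScript, Java, and Node.js; Python's built-in unittest stands in here purely as an illustration, and the function under test is made up for the example:

```python
import unittest

def apply_discount(price: float, percent: float) -> float:
    """Function under test (hypothetical): apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # Test scenarios are prioritized: a typical case, a boundary case,
    # and an invalid input that must be reported rather than ignored.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(200.0, 25), 150.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(99.99, 0), 99.99)

    def test_invalid_percent_is_reported(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite programmatically and keep the result for reporting,
# the way a scheduled automation run would feed a bug report.
result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
)
print("all tests passed:", result.wasSuccessful())
```

The failing-input test is the part that feeds the "identify and report bugs" duty: a regression shows up as a failed assertion in the run report rather than as silent bad data.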