20+ Data architecture Jobs in India
Client based at Delhi/NCR and Pune locations.
Mandatory Skills: Data Engineering, Client Engagement, Project Management, Project Delivery, Team Leadership, Data Governance, Quality Assurance, Business Development, Data Architecture
Additional Skills: Communication Skills, Problem-Solving Skills
Job Description
This position requires someone with good problem-solving skills, business understanding and client presence. Overall professional experience should be above 8 years, with a minimum of 5 years of experience leading and managing a client portfolio in the Data Engineering space. The candidate should have a good understanding of business operations, the challenges faced, and the business technology used across business functions.
The candidate must understand the use of traditional and modern data engineering technologies/tools for solving business problems and help clients in their data journey. The candidate must have knowledge of emerging technologies for data management, including data governance, data quality, security, data integration, processing, and provisioning, and must possess the soft skills required to work with teams and lead medium-to-large teams.
The candidate should be comfortable taking leadership roles in client projects, pre-sales/consulting, solutioning, business development conversations, and execution of data engineering projects.
Key Responsibilities:
Client Engagement & Relationship Management:
- Serve as the primary point of contact for clients on data engineering projects, understanding their needs, challenges, and goals.
- Develop and maintain strong client relationships, ensuring high levels of client satisfaction and repeat business.
- Translate client requirements into actionable technical solutions and project plans.
Project Management & Delivery:
- Oversee the delivery of data engineering projects from inception to completion, ensuring projects are delivered on time, within scope, and within budget.
- Manage project resources, timelines, and risks, ensuring smooth project execution and delivery.
- Collaborate with cross-functional teams including data scientists, business analysts, and IT professionals to deliver comprehensive data solutions.
Technical Leadership & Innovation:
- Lead the design, development, and deployment of scalable data architectures, pipelines, and processes tailored to client needs.
- Stay abreast of industry trends, technologies, and best practices, and implement them in client projects to drive innovation and competitive advantage.
- Provide technical oversight and guidance to the data engineering team, ensuring the adoption of best practices and high-quality output.
Team Leadership & Development:
- Lead, mentor, and manage a team of data engineers, fostering a collaborative and high-performance culture.
- Provide professional development opportunities, coaching, and career growth support to team members.
- Ensure the team is equipped with the necessary skills and tools to deliver high-quality consulting services.
Data Governance & Quality Assurance:
- Implement and oversee data governance frameworks, ensuring data integrity, security, and compliance across all client projects.
- Establish and enforce data quality standards, ensuring the reliability and accuracy of data used in client solutions.
Business Development & Consulting:
- Support business development efforts by contributing to proposals, presenting solutions to prospective clients, and identifying opportunities for expanding client engagements.
- Provide thought leadership in data engineering, contributing to white papers, webinars, and conferences to enhance the company’s reputation in the industry.
Experience candidates should bring:
- 8 to 12 years of data engineering experience with at least 3 years in a managerial role within a consulting or professional services environment.
- Proven experience in managing multiple, complex data engineering projects simultaneously.
- Experience in leading a team of 8 to 12 professionals.
- Strong problem-solving skills and the ability to handle complex, ambiguous situations.
- Exceptional project management skills, with experience in Agile methodologies.
- A client-service mindset and a desire to take on tough and challenging projects
- Effective communication skills, both written and verbal
- Ability to work effectively across functions and levels; comfort collaborating with teammates in a virtual environment.
Required Qualification
Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)
Enterprise Data Architect - Dataeconomy (25+ Years Experience)
About Dataeconomy:
Dataeconomy is a rapidly growing company at the forefront of Information Technology. We are driven by data and committed to using it to make better decisions, improve our products, and deliver exceptional value to our customers.
Job Summary:
Dataeconomy seeks a seasoned and strategic Enterprise Data Architect to lead the company's data transformation journey. With 25+ years of experience in data architecture and leadership, you will be pivotal in shaping our data infrastructure, governance, and culture. You will leverage your extensive expertise to build a foundation for future growth and innovation, ensuring our data assets are aligned with business objectives and drive measurable value.
Responsibilities:
Strategic Vision and Leadership:
Lead the creation and execution of a long-term data strategy aligned with the company's overall vision and goals.
Champion a data-driven culture across the organization, fostering cross-functional collaboration and data literacy.
Advise senior leadership on strategic data initiatives and their impact on business performance.
Architecture and Modernization:
Evaluate and modernize the existing data architecture, recommending and implementing innovative solutions.
Design and implement a scalable data lake/warehouse architecture for future growth.
Advocate for and adopt cutting-edge data technologies and best practices.
ETL Tool Experience (8+ years):
Extensive experience in designing, developing, and implementing ETL (Extract, Transform, Load) processes using industry-standard tools such as Informatica PowerCenter, IBM DataStage, Microsoft SSIS, or open-source options like Apache Airflow.
Proven ability to build and maintain complex data pipelines that integrate data from diverse sources, transform it into usable formats, and load it into target systems.
Deep understanding of data quality and cleansing techniques to ensure the accuracy and consistency of data across the organization.
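The bullets above stay at the level of tools; purely as a hedged illustration (not part of the original posting), a minimal Airflow DAG for the kind of extract, transform, load flow described above might look like the sketch below. The task names, sample rows, schedule and target table are hypothetical placeholders.

# Minimal Airflow 2.x DAG sketch: extract from a source, apply basic cleansing, load to a warehouse.
# All sample rows, task names, and the "target warehouse table" are illustrative placeholders.
from datetime import datetime
import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # A real pipeline would use an Airflow connection/hook; fake rows stand in for a source system.
    rows = [{"order_id": 1, "amount": "120.50"}, {"order_id": 2, "amount": "75.00"}]
    context["ti"].xcom_push(key="raw_rows", value=rows)

def transform(**context):
    rows = context["ti"].xcom_pull(key="raw_rows", task_ids="extract")
    df = pd.DataFrame(rows)
    df["amount"] = df["amount"].astype(float)  # basic typing/cleansing step
    context["ti"].xcom_push(key="clean_rows", value=df.to_dict("records"))

def load(**context):
    rows = context["ti"].xcom_pull(key="clean_rows", task_ids="transform")
    print(f"Would load {len(rows)} rows into the target warehouse table")

with DAG(
    dag_id="orders_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_t = PythonOperator(task_id="extract", python_callable=extract)
    transform_t = PythonOperator(task_id="transform", python_callable=transform)
    load_t = PythonOperator(task_id="load", python_callable=load)
    extract_t >> transform_t >> load_t

The same extract/transform/load structure maps onto the commercial tools named above (PowerCenter, DataStage, SSIS); only the orchestration layer differs.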
Data Governance and Quality:
Establish and enforce a comprehensive data governance framework ensuring data integrity, consistency, and security.
Develop and implement data quality standards and processes for continuous data improvement.
Oversee the implementation of master data management and data lineage initiatives.
Collaboration and Mentorship:
Mentor and guide data teams, including architects, engineers, and analysts, on data architecture principles and best practices.
Foster a collaborative environment where data insights are readily shared and acted upon across the organization.
Build strong relationships with business stakeholders to understand and translate their data needs into actionable solutions.
Qualifications:
Education: Master's degree in Computer Science, Information Systems, or a related field; Ph.D. preferred.
Experience: 25+ years of experience in data architecture and design, with 10+ years in a leadership role.
Technical Skills:
Deep understanding of TOGAF, AWS, MDM, EDW, the Hadoop ecosystem (MapReduce, Hive, HBase, Pig, Flume, Sqoop), cloud data platforms (Azure Synapse, Google BigQuery), modern data pipelines, streaming analytics, and data governance frameworks.
Proficiency in programming languages (Java, Python, SQL), scripting languages (Bash, Python), data modelling tools (ER diagramming software), and BI tools.
Extensive expertise in ETL tools (Informatica PowerCenter, IBM DataStage, Microsoft SSIS, Apache Airflow)
Familiarity with emerging data technologies (AI/ML, blockchain), data security and compliance frameworks.
Soft Skills:
Outstanding communication, collaboration, and leadership skills.
Strategic thinking and problem-solving abilities with a focus on delivering impactful solutions.
Strong analytical and critical thinking skills.
Ability to influence and inspire teams to achieve goals.
- Design and implement effective database solutions and models to store and retrieve company data.
- Examine and identify database structural necessities by evaluating client operations, applications, and programming.
- Assess database implementation procedures to ensure they comply with internal and external regulations.
- Install and organize information systems to guarantee company functionality.
- Prepare accurate database design and architecture reports for management and executive teams.
- Oversee the migration of data from legacy systems to new solutions.
- Monitor the system performance by performing regular tests, troubleshooting, and integrating new features.
- Recommend solutions to improve new and existing database systems.
- Educate staff members through training and individual support.
- Offer support by responding to system problems in a timely manner.
Deep-Rooted.Co is on a mission to get Fresh, Clean, Community (Local farmer) produce from harvest to reach your home with a promise of quality first! Our values are rooted in trust, convenience, and dependability, with a bunch of learning & fun thrown in.
Founded in Bangalore by Arvind, Avinash, Guru and Santosh, and backed by our investors Accel, Omnivore and Mayfield, we have raised $7.5 million to date across Seed, Series A and debt funding. Our brand, Deep-Rooted.Co, launched in August 2020 as a first-of-its-kind fruits & vegetables (F&V) offering in India. We are present in Bangalore and Hyderabad and are expanding to newer cities, managed seamlessly through a tech platform designed and built to transform the agri-tech sector.
Deep-Rooted.Co is committed to building a diverse and inclusive workplace and is an equal-opportunity employer.
How is this possible? It's because we work with smart people. We are looking for Engineers in Bangalore to work with the Product Leader (Founder) (https://www.linkedin.com/in/gururajsrao/) and the CTO (https://www.linkedin.com/in/sriki77/). This is a meaningful project for us, and we are sure you will love it as it touches everyday life and is fun. This will be a virtual consultation.
We want to start the conversation about the project we have for you, but before that, we want to connect with you to know what’s on your mind. Do drop a note sharing your mobile number and letting us know when we can catch up.
Purpose of the role:
* As a startup, we have data distributed across various sources such as Excel, Google Sheets and databases. As we grow, we need swift decision-making based on all the data that exists. You will help us bring this data together and put it into a data model that can be used for business decision-making (a rough sketch of this kind of work follows the skills list below).
* Handle the nuances of the Excel and Google Sheets APIs.
* Pull data in and manage its growth, freshness and correctness.
* Transform data into a format that aids easy decision-making for Product, Marketing and Business Heads.
* Understand the business problem, solve it using the technology and take it to production - no hand-offs - the full path to production is yours.
Technical expertise:
* Good knowledge of and experience with programming languages - Java, SQL, Python.
* Good knowledge of Data Warehousing and Data Architecture.
* Experience with data transformations and ETL.
* Experience with API tools and more closed systems such as Excel, Google Sheets etc.
* Experience with the AWS Cloud Platform and Lambda.
* Experience with distributed data processing tools.
* Experience with container-based deployments on cloud.
Skills:
Java, SQL, Python, Data Build Tool, Lambda, HTTP, REST API, Extract Transform Load.
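As a rough, hedged sketch of the Sheets-to-warehouse work described in the purpose of the role (not taken from the posting), pulling a range from the Google Sheets API into pandas might look like this. The credentials file, spreadsheet ID, sheet range, and the "amount" column are hypothetical placeholders.

# Sketch: pull a range from Google Sheets and normalise it with pandas.
# Credentials file, spreadsheet ID, range, and column names are illustrative placeholders.
import pandas as pd
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/spreadsheets.readonly"]
creds = Credentials.from_service_account_file("service-account.json", scopes=SCOPES)
sheets = build("sheets", "v4", credentials=creds)

resp = (
    sheets.spreadsheets()
    .values()
    .get(spreadsheetId="YOUR_SPREADSHEET_ID", range="Orders!A1:D")
    .execute()
)
values = resp.get("values", [])
if not values:
    raise SystemExit("no data returned from the sheet")

df = pd.DataFrame(values[1:], columns=values[0])              # first row as header
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")   # assumes an 'amount' column exists
print(df.head())
# From here the frame could be loaded into a warehouse table for Product/Marketing reporting.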
Strategic Activities
- Understand the current technology landscape and identify the critical gaps in terms of the overarching requirements from various core functions (commercial, operations, etc.)
- Decide on and design the overall technology strategy and roadmap to fit the organizational need for Air India
- Hold the team accountable to drive enterprise-level digital transformation projects that align with the overall organizational strategy and the different requirements of the functions
- Provide guidance and recommendations to the CDTO and senior leadership on different enterprise and data architecture options and make the business case for selecting the appropriate model
- Act as an SME of Digital/Technology Architecture for the senior management
Digital and Technology Architecture
- Head the digital transformation and innovation projects from idea generation to new ventures in close collaboration with business stakeholders and external partners
- Present the vision & value of proposed architectures and solutions to a wide range of audiences in alignment with business priorities and objectives
- Plan tasks and estimates for the required research and volume of activities to complete work
- Own and assess non-functional requirements and propose solutions for Availability, Backup, Capacity, Performance, Redundancy, Reliability, Scalability, Supportability, Risks and Costs models
- Provide strategic guidance to teams on managing third-party service providers in terms of service levels, costing, etc.
- Drive the team to ensure appropriate documentation is developed in support of value realization
- Lead the technical team and head collaboration between Business Users and software providers to build digital solutions
Enterprise Architecture
- Ownership of overall Enterprise Architecture including compliances and standards
- Head the overall Architecture blueprint and roadmap for Air India applications aligning with Enterprise Architecture
- Identify important potential technologies and approaches to address current and future Enterprise needs, evaluating their applicability and fit, as well as leading the definition of standards and best practice for their use
Data Architecture
- Ownership of overall Data Architecture including compliances and standards
- Head the overall Architecture blueprint and roadmap for Air India applications aligning with Data Architecture
- Identify important potential technologies and approaches to address current and future Data needs, evaluating their applicability and fit, as well as leading the definition of standards and best practice for their use
Team Management
- Lead the technical team and head collaboration between Business Users and software providers to build digital solutions
- Support, nurture and develop the team members' skills in all digital transformation and architecture innovation aspects
Required Skills:
- Proven work experience as an Enterprise / Data / Analytics Architect - Data Platform in HANA XSA, XS, Data Intelligence and SDI
- Can work on new and existing architecture decision in HANA XSA, XS, Data Intelligence and SDI
- Well versed with data architecture principles, software / web application design, API design, UI / UX capabilities, XSA / Cloud foundry architecture
- In-depth understanding of database structure (HANA in-memory) principles.
- In-depth understanding of ETL solutions and data integration strategy.
- Excellent knowledge of Software and Application design, API, XSA, and microservices concepts
Roles & Responsibilities:
- Advise on and ensure compliance with the defined Data Architecture principles.
- Identify new technology updates and development tools, including new releases/upgrades/patches, as required.
- Analyze technical risks and advise on risk mitigation strategy.
- Advise on and ensure compliance with existing and newly developed data and reporting standards, including naming conventions.
The working window is ideally AEST (8 am to 5 pm), which means starting at 3:30 am IST. We understand this can be very early for an SME supporting from India; hence, we can consider candidates who can support from at least 7 am IST (earlier is possible).
Opportunity with Largest Conglomerate
Digital and Technology Architecture
- Head the digital transformation and innovation projects from idea generation to new ventures in close collaboration with business stakeholders and external partners
- Present the vision & value of proposed architectures and solutions to a wide range of audiences in alignment with business priorities and objectives
- Plans tasks and estimates for the required research and volume of activities to complete work
- Own and assess non-functional requirements and propose solutions for Availability, Backup, Capacity, Performance, Redundancy, Reliability, Scalability, Supportability, Risks and Costs models
- Provide strategic guidance to teams on managing third-party service providers in terms of service levels, costing, etc.
- Drive the team to ensure appropriate documentation is developed in support of value realization
- Lead the technical team and head collaboration between Business Users and software providers to build digital solutions
Enterprise Architecture
- Ownership of overall Enterprise Architecture including compliances and standards
- Head the overall Architecture blueprint and roadmap for applications aligning with Enterprise Architecture
- Identify important potential technologies and approaches to address current and future Enterprise needs, evaluating their applicability and fit, as well as leading the definition of standards and best practice for their use
Data Architecture
- Ownership of overall Data Architecture including compliances and standards
- Head the overall Architecture blueprint and roadmap for applications aligning with Data Architecture
- Identify important potential technologies and approaches to address current and future Data needs, evaluating their applicability and fit, as well as leading the definition of standards and best practice for their use
.NET Lead (Need B3) Job Description:
Responsibilities / Expectations
- Tech/team lead requirement in the ICS Simplification domain in the MAAS application.
- 5-8 years of total IT experience.
- At least 4 years in Application Development/Maintenance/Support using .NET Framework
- Should be able to perform migration of legacy applications to Cloud/On Prem by thoroughly understanding the integration and compatibility requirements
- Should be able to Debug and resolve the Application related issues with Migration, and compatibility with the latest Windows/RHEL environment
Skills required
Technical Skills (Must have)
- Strong understanding of .NET Architecture and Compatibility requirements
- Understanding of Data Architecture and implementing databases
- Understanding of Data Migrations and Data Integrations.
- Application & Application security knowledge (certificates/authentication/authorization)
Technical Skills (Good to have)
- Knowledge of Cloud resources - Storage, Networking, Security, Identity, Management.
- Experience in Migration of legacy applications to Cloud/On Prem
Soft Skills
- Should interact/communicate effectively with different domains for application installation and issue resolution
- Need to interact with other teams related to any integrations with application migration.
- Effective Stakeholder/Customer Management.
- Engaging with necessary stakeholders and SMEs.
- Good Problem Solving skills and approach
- Team handling and Mentoring
- Handling and Minimising Escalations.
Provide weekly/fortnightly/monthly status updates to the customer and connect with the customer.
Who are we looking for?
- Someone who is annoyed by the time it takes for an application to build and has actually done something to optimise it
- Has good experience in building Android applications
- Experience with the Flutter ecosystem is a great plus.
- Someone who likes to think in terms of software and data architecture before opening Android Studio.
- Comfortable with managing development and deployment of applications.
- Open to, and more importantly excited about, learning new technologies.
Roles and responsibilities
- Participate and contribute in the design and development of the core components of the Filo service.
- Ensure high quality of software development with respect to project architecture, code quality, testing and deployment.
- Implement testing frameworks and disciplines as part of every feature development.
- Own the performance of the app in production and implement/push for the implementation of systems to monitor, debug and fix issues in production in the lowest TAT possible.
- Advocate that good engineering has the highest priority, with the only exception being value delivered to the end user.
Benefits
- MacBook Pro goes without saying
- Stock Grants and Discounts
- Flexible Working Hours
- Flexible core working hours
- Development budget (conferences, training, Udemy, language classes)
- Internal tech guilds, Hackathon and public Meetups
- A learning environment where you can extend and build upon your skills
- Great Office Location
- Regular company parties and team events
Join In
We are a team of educators and engineers who believe there is a lot that can be done when it comes to how people learn things. We believe that while a good book is a must, so are a good experience and a good teacher, both of which are unfortunately not well explored.
Join us on this exploration!
* Formulate and recommend standards for achieving maximum performance and efficiency of the DW ecosystem.
* Participate in pre-sales activities for solutions to various customer problem statements/situations.
* Develop business cases and ROI for the customer/clients.
* Interview stakeholders and develop a BI roadmap for success, given project prioritization.
* Evangelize self-service BI and visual discovery while helping to automate any manual process at the client site.
* Work closely with the Engineering Manager to ensure prioritization of customer deliverables.
* Champion data quality, integrity, and reliability throughout the organization by designing and promoting best practices.
Implementation (20%)
* Help DW/DE team members with issues needing technical expertise or complex systems and/or programming knowledge.
* Provide on-the-job training for new or less experienced team members.
* Develop a technical excellence team.
Requirements
- Experience designing business intelligence solutions
- Experience with ETL processes and data warehouse architecture
- Experience with Azure Data services, i.e., ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, and Power BI
- Good analytical and problem-solving skills
- Fluent in relational database concepts and flat file processing concepts
- Must be knowledgeable in software development lifecycles/methodologies
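Purely as a hedged illustration of the Azure stack named in the requirements above (not part of the posting), a PySpark step on Azure Databricks might read raw files from ADLS Gen 2 and publish a curated table roughly as follows. The storage account, container, column names, and target table are made-up placeholders.

# Sketch of a Databricks/PySpark step: raw CSV in ADLS Gen 2 -> curated table.
# Storage account, container, column, and table names are illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("curate-sales").getOrCreate()  # provided as `spark` on Databricks

raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/")
)

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))  # assumes an 'amount' column
       .filter(F.col("amount").isNotNull())
)

# Assumes an 'analytics' database already exists; on Databricks this lands as a Delta table,
# which Power BI or Synapse can then query.
curated.write.mode("overwrite").saveAsTable("analytics.sales_curated")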
- The ideal candidate is adept at using large data sets to find opportunities for product and process optimization and using models to test the effectiveness of different courses of action.
- Mine and analyze data from company databases to drive optimization and improvement of product development, marketing techniques and business strategies.
- Assess the effectiveness and accuracy of new data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting and other business outcomes.
- Develop company A/B testing framework and test model quality.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
Roles & Responsibilities
- Experience using statistical languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
- Experience working with and creating data architectures.
- Looking for someone with 3-7 years of experience manipulating data sets and building statistical models
- Has a Bachelor's or Master's degree in Computer Science or another quantitative field
- Knowledge and experience in statistical and data mining techniques :
- GLM/Regression, Random Forest, Boosting, Trees, text mining, social network analysis, etc.
- Experience querying databases and using statistical computer languages: R, Python, SQL, etc.
- Experience creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
- Experience with distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, Gurobi, MySQL, etc.
- Experience visualizing/presenting data for stakeholders using: Periscope, Business Objects, D3, ggplot, etc.
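As a hedged, generic illustration of the regression/random-forest work listed above (not taken from the posting), a minimal modelling loop in Python might look like the sketch below; the synthetic data stands in for whatever company data sets the role actually uses.

# Minimal sketch: fit and evaluate a random-forest regressor on synthetic data.
# Real work would pull features from company databases; everything here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
X = rng.normal(size=(1_000, 5))                          # five made-up features
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(size=1_000)   # noisy linear target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = RandomForestRegressor(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
print("Feature importances:", model.feature_importances_.round(3))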
Senior Product Analyst
Pampers Start Up Team
India / Remote Working
Team Description
Our internal team focuses on App Development, with data a growing area within the structure. We have a clear vision and strategy coupled with App Development, Data, Testing, Solutions and Operations. The data team sits across the UK and India, whilst other teams sit across Dubai, Lebanon, Karachi and various cities in India.
Role Description
In this role you will use a range of tools and technologies, working primarily on data design, data governance, reporting and analytics for the Pampers App.
This is a unique opportunity for an ambitious candidate to join a growing business where they will get exposure to a diverse set of assignments, can contribute fully to the growth of the business and where there are no limits to career progression and reward.
Responsibilities
● Be the Data Steward and drive governance, with a full understanding of all the data that flows through the apps to all systems
● Work with the campaign team to apply data fixes when issues arise with campaigns
● Investigate and troubleshoot issues with the product and campaigns, giving clear RCA and impact analysis
● Document data, create data dictionaries and be the “go to” person for understanding how data flows
● Build dashboards and reports using Amplitude and Power BI and present them to the key stakeholders
● Carry out ad hoc data investigations into issues with the app, querying data in BigQuery/SQL/CosmosDB, and present findings back (a rough query sketch follows this list)
● Translate analytics into a clear PowerPoint deck with actionable insights
● Write up clear documentation on processes
● Innovate with new processes or ways of providing analytics and reporting
● Help the data lead to find new ways of adding value
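The ad hoc BigQuery investigation mentioned in the responsibilities might, as a rough sketch only, look like the following; the project, dataset, table, and column names are hypothetical, and the google-cloud-bigquery client library is assumed.

# Sketch: ad hoc investigation of app events in BigQuery (google-cloud-bigquery assumed).
# Project, dataset, table, and column names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
    SELECT event_name, COUNT(*) AS events, COUNT(DISTINCT user_id) AS users
    FROM `example-project.app_analytics.events`
    WHERE DATE(event_timestamp) = CURRENT_DATE()
    GROUP BY event_name
    ORDER BY events DESC
"""
df = client.query(sql).to_dataframe()   # requires pandas + db-dtypes installed
print(df.head(10))                      # summarise into a Power BI view or PowerPoint deck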
Requirements
● Bachelor's degree and a minimum of 4 years' experience in an analytical role, preferably working in product analytics with consumer app data
● Strong SQL Server and Power BI skills required
● You have experience with most or all of these tools – SQL Server, Python, Power BI, BigQuery.
● Understanding of mobile app data (Events, CTAs, Screen Views etc)
● Knowledge of data architecture and ETL
● Experience in analyzing customer behavior and providing insightful recommendations
● Self-starter, with a keen interest in technology and highly motivated towards success
● Must be proactive and be prepared to address meetings
● Must show initiative and desire to learn business subjects
● Able to work independently and provide updates to management
● Strong analytical and problem-solving capabilities with meticulous attention to detail
● Excellent problem-solving skills; proven teamwork and communication skills
● Experience working in a fast paced “start-up like” environment
Desirable
- Knowledge of mobile analytical tools (Segment, Amplitude, Adjust, Braze and Google Analytics)
- Knowledge of loyalty data
Company Profile:
Easebuzz is a payment solutions (fintech) company that enables online merchants to accept, process and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions and eKYC happens at the same time.
We have been consistently profitable and are constantly developing new, innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied to the vision of building a workplace that breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.
Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.
Salary: As per company standards.
Designation: Data Engineer
Location: Pune
Experience with ETL, Data Modeling, and Data Architecture
Design, build and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd parties - Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.
Experience with AWS cloud data lake for development of real-time or near real-time use cases
Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing
Build data pipeline frameworks to automate high-volume and real-time data delivery
Create prototypes and proof-of-concepts for iterative development.
Experience with NoSQL databases, such as DynamoDB, MongoDB etc
Create and maintain optimal data pipeline architecture,
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow
Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
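To ground the real-time ingestion responsibilities above, here is a minimal, hedged sketch of publishing events to a Kinesis stream with boto3 for downstream Lambda/Glue/Redshift processing; the stream name, region, and event fields are hypothetical placeholders, not details from this posting.

# Sketch: publish payment events to a Kinesis stream (boto3).
# Stream name, region, and payload fields are illustrative placeholders.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish_payment_event(order_id: str, amount: float, status: str) -> None:
    record = {"order_id": order_id, "amount": amount, "status": status}
    kinesis.put_record(
        StreamName="payments-events",            # hypothetical stream name
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=order_id,                   # keeps one order's events in order
    )

publish_payment_event("ORD-1001", 499.0, "SUCCESS")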
Employment Type
Full-time
Job Description
Experience: 3+ yrs
We are looking for a MySQL DBA who will be responsible for ensuring the performance, availability, and security of clusters of MySQL instances. You will also be responsible for database design, database architecture, and orchestrating upgrades, backups, and provisioning of database instances. You will work in tandem with other teams, preparing documentation and specifications as required.
Responsibilities:
Database design and data architecture
Provision MySQL instances, both in clustered and non-clustered configurations
Ensure performance, security, and availability of databases
Prepare documentation and specifications
Handle common database procedures, such as upgrade, backup, recovery, migration, etc.
Profile server resource usage, optimize and tweak as necessary
Skills and Qualifications:
Proven expertise in database design and data architecture for large scale systems
Strong proficiency in MySQL database management
Decent experience with recent versions of MySQL
Understanding of MySQL's underlying storage engines, such as InnoDB and MyISAM
Experience with replication configuration in MySQL
Knowledge of de-facto standards and best practices in MySQL
Proficient in writing and optimizing SQL statements
Knowledge of MySQL features, such as its event scheduler
Ability to plan resource requirements from high level specifications
Familiarity with other SQL/NoSQL databases such as Cassandra, MongoDB, etc.
Knowledge of limitations in MySQL and their workarounds in contrast to other popular relational databases
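As a small, hedged illustration of the replication and availability duties above (not from the posting), a replica health check might look like the sketch below; the hostname, user, and password are placeholders, mysql-connector-python is assumed, and the statement shown targets MySQL 8.0.22+ (older servers use SHOW SLAVE STATUS).

# Sketch: check replica health on a MySQL 8.0.22+ instance (mysql-connector-python assumed).
# Host, user, and password are illustrative placeholders.
import mysql.connector

conn = mysql.connector.connect(host="replica-1.internal", user="monitor", password="***")
cur = conn.cursor(dictionary=True)
cur.execute("SHOW REPLICA STATUS")        # pre-8.0.22 servers: SHOW SLAVE STATUS
status = cur.fetchone()

if status is None:
    print("Not configured as a replica")
else:
    healthy = (
        status["Replica_IO_Running"] == "Yes"
        and status["Replica_SQL_Running"] == "Yes"
        and (status["Seconds_Behind_Source"] or 0) < 60   # arbitrary lag threshold
    )
    print("replica healthy:", healthy, "| lag:", status["Seconds_Behind_Source"])

cur.close()
conn.close()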
Unique Data Solutions Provider
• Ability to understand customer requirements and create customized demonstrations and collateral
• Provide product feedback (feature requests, user experience) to the development team
• Strong foundation in system-level architectures and compute, storage and networking infrastructure, specifically:
• Compute architectures – physical and virtualized, operating systems (Linux strongly preferred)
• Storage systems – file systems, object stores
• On-prem data center and public cloud (AWS, Azure, Google Cloud) environments
• Hands-on experience with Linux/Unix systems as a system administrator or equivalent role involving installing software and security patches, installing hardware components on servers as per product manuals, etc.
• Hands-on experience working with public cloud infrastructure and services. Cloud certifications are preferred.
• Basic understanding of enterprise system deployment architecture around network configuration, security-related settings, etc.
• Experience troubleshooting configuration issues to resolve them independently or in collaboration with customer support teams.
• Be able to work with development/L3 support teams to live-debug any issues for swift resolution
• Experience with programming or scripting languages such as Python, Java, Go is preferred.
• Experience with data management, DevOps, micro-services, containerization
Technical must haves:
● Extensive exposure to at least one Business Intelligence platform (if possible, QlikView/Qlik Sense) – if not Qlik, then ETL tool knowledge, e.g. Informatica/Talend
● At least one data query language – SQL/Python
● Experience in creating breakthrough visualizations
● Understanding of RDBMS, Data Architecture/Schemas, Data Integrations, Data Models and Data Flows is a must
● A technical degree like BE/B.Tech is a must
Technical Ideal to have:
● Exposure to our tech stack – PHP
● Microsoft workflows knowledge
Behavioural Pen Portrait:
● Must Have: Enthusiastic, aggressive, vigorous, high achievement orientation, strong command over spoken and written English
● Ideal: Ability to Collaborate
Preferred location is Ahmedabad; however, if we find exemplary talent, we are open to a remote working model - this can be discussed.
- 6+ years of recent hands-on Java development
- Developing data pipelines in AWS or Google Cloud
- Java, Python, JavaScript programming languages
- Great understanding of designing for performance, scalability, and reliability of data intensive application
- Hadoop MapReduce, Spark, Pig. Understanding of database fundamentals and advanced SQL knowledge.
- In-depth understanding of object oriented programming concepts and design patterns
- Ability to communicate clearly to technical and non-technical audiences, verbally and in writing
- Understanding of full software development life cycle, agile development and continuous integration
- Experience in Agile methodologies including Scrum and Kanban
Pingahla is recruiting Business Intelligence Consultants/Senior Consultants who can help us with Information Management projects (domestic, onshore and offshore) as developers and team leads. Candidates are expected to have 3-6 years of experience with Informatica PowerCenter/Talend DI/Informatica Cloud and must be very proficient with Business Intelligence in general. The job is based out of our Pune office.
Responsibilities:
- Manage the customer relationship by serving as the single point of contact before, during and after engagements.
- Architect data management solutions.
- Provide technical leadership to other consultants and/or customer/partner resources.
- Design, develop, test and deploy data integration solutions in accordance with customer’s schedule.
- Supervise and mentor all intermediate and junior level team members.
- Provide regular reports to communicate status both internally and externally.
Qualifications:
A typical profile that would suit this position would have the following background:
- A graduate from a reputed engineering college
- An excellent IQ and analytical skills, with the ability to grasp new concepts and learn new technologies.
- A willingness to work with a small team in a fast-growing environment.
- A good knowledge of Business Intelligence concepts
Mandatory Requirements:
- Knowledge of Business Intelligence
- Good knowledge of at least one of the following data integration tools - Informatica PowerCenter, Talend DI, Informatica Cloud
- Knowledge of SQL
- Excellent English and communication skills
- Intelligent, quick to learn new technologies
- Track record of accomplishment and effectiveness with handling customers and managing complex data management needs
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet business requirements
- Identify, design, and implement internal process improvements, including redesigning infrastructure for greater scalability, optimizing data delivery, and automating manual processes
- Work with Data, Analytics & Tech team to extract, arrange and analyze data
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies
- Build analytical tools that utilize the data pipeline, providing actionable insight into key business performance metrics, including operational efficiency and customer acquisition
- Work closely with all business units and engineering teams to develop a strategy for long-term data platform architecture
- Work with stakeholders including the Executive, Product, Data, and Design teams to support their data infrastructure needs and assist with data-related technical issues
- SQL
- Ruby or Python (Ruby preferred)
- Apache-Hadoop based analytics
- Data warehousing
- Data architecture
- Schema design
- ML
- Prior experience of 2 to 5 years as a Data Engineer.
- Ability in managing and communicating data warehouse plans to internal teams.
- Experience designing, building, and maintaining data processing systems.
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions.
- Excellent analytic skills associated with working on unstructured datasets.
- Ability to build processes that support data transformation, workload management, data structures, dependency, and metadata.
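As a generic, hedged sketch of the data warehousing and schema design skills listed above (using Python with SQLite purely for portability; table and column names are made up and not from the posting), a tiny star schema might be declared like this:

# Sketch: a minimal star schema (one dimension, one fact) created via SQLite for portability.
# A real warehouse would use Redshift/Snowflake/etc.; names here are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL,
    city         TEXT,
    signup_date  TEXT
);

CREATE TABLE fact_orders (
    order_key    INTEGER PRIMARY KEY,
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    order_date   TEXT NOT NULL,
    amount       REAL NOT NULL
);

CREATE INDEX ix_fact_orders_customer ON fact_orders(customer_key);
""")
print("star schema created")
conn.close()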
Responsibilities for Data Architect
- Research and properly evaluate sources of information to determine possible limitations in reliability or usability
- Apply sampling techniques to effectively determine and define ideal categories to be questioned
- Compare and analyze provided statistical information to identify patterns, relationships and problems
- Define and utilize statistical methods to solve industry-specific problems in varying fields, such as economics and engineering
- Prepare detailed reports for management and other departments by analyzing and interpreting data
- Train assistants and other members of the team on how to properly organize findings and read the data collected
- Design computer code using various languages to improve and update software and applications
- Refer to previous instances and findings to determine the ideal method for gathering data