Data modeling jobs

50+ Data modeling Jobs in India

Apply to 50+ Data modeling Jobs on CutShort.io. Find your next job, effortlessly. Browse Data modeling Jobs and apply today!

Fountane Inc
HR Fountane
Posted by HR Fountane
Remote only
5 - 8 yrs
₹18L - ₹28L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+9 more

Position Overview: We are looking for an experienced and highly skilled Data Architect to join our team and help design, implement, and optimize data systems that support high-end analytical solutions for our clients. As a customer-centric Data Architect, you will work closely with clients to understand their business needs and translate them into robust, scalable, and efficient technical solutions. You will be responsible for end-to-end data modelling, integration workflows, and data transformation processes while ensuring security, privacy, and compliance. In this role, you will also leverage the latest advancements in artificial intelligence, machine learning, and large language models (LLMs) to deliver high-impact solutions that drive business success. The ideal candidate will have a deep understanding of data infrastructure, optimization techniques, and cost-effective data management.


Key Responsibilities:


• Customer Collaboration:

– Partner with clients to gather and understand their business requirements, translating them into actionable technical specifications.

– Act as the primary technical consultant to guide clients through data challenges and deliver tailored solutions that drive value.


• Data Modeling & Integration:

– Design and implement scalable, efficient, and optimized data models to support business operations and analytical needs.

– Develop and maintain data integration workflows to seamlessly extract, transform, and load (ETL) data from various sources into data repositories (a minimal pipeline sketch follows this list).

– Ensure smooth integration between multiple data sources and platforms, including cloud and on-premise systems.
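
To make the ETL responsibility above concrete, here is a minimal illustrative sketch of such a workflow in Python. The source file, table, and column names (an orders CSV, a fact_orders table, SQLite standing in for the warehouse) are assumptions for illustration, not part of the role.

```python
# Minimal illustrative ETL sketch; all names and the SQLite target are placeholders.
import sqlite3

import pandas as pd


def run_pipeline(source_csv: str, db_path: str) -> int:
    # Extract: pull raw records from the source system export.
    raw = pd.read_csv(source_csv, parse_dates=["order_date"])

    # Transform: drop incomplete rows and derive a monthly reporting grain.
    clean = raw.dropna(subset=["order_id", "amount"])
    clean = clean.assign(order_month=clean["order_date"].dt.to_period("M").astype(str))

    # Load: append into the analytical store (SQLite stands in for a warehouse).
    with sqlite3.connect(db_path) as conn:
        clean.to_sql("fact_orders", conn, if_exists="append", index=False)
    return len(clean)
```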


• Data Processing & Optimization:

– Develop, optimize, and manage data processing pipelines to enable real-time and batch data processing at scale.

– Continuously evaluate and improve data processing performance, optimizing for throughput while minimizing infrastructure costs.


• Data Governance & Security:

– Implement and enforce data governance policies and best practices, ensuring data security, privacy, and compliance with relevant industry regulations (e.g., GDPR, HIPAA).

– Collaborate with security teams to safeguard sensitive data and maintain privacy controls across data environments.


• Cross-Functional Collaboration:

– Work closely with data engineers, data scientists, and business analysts to ensure that the data architecture aligns with organizational objectives and delivers actionable insights.

– Foster collaboration across teams to streamline data workflows and optimize solution delivery.


• Leveraging Advanced Technologies:

– Utilize AI, machine learning models, and large language models (LLMs) to automate processes, accelerate delivery, and provide smart, data-driven solutions to business challenges.

– Identify opportunities to apply cutting-edge technologies to improve the efficiency, speed, and quality of data processing and analytics.


• Cost Optimization:

– Proactively manage infrastructure and cloud resources to optimize throughput while minimizing operational costs.

– Make data-driven recommendations to reduce infrastructure overhead and increase efficiency without sacrificing performance.


Qualifications:


• Experience:

– Proven experience (5+ years) as a Data Architect or similar role, designing and implementing data solutions at scale.

– Strong expertise in data modelling, data integration (ETL), and data transformation processes.

– Experience with cloud platforms (AWS, Azure, Google Cloud) and big data technologies (e.g., Hadoop, Spark).


• Technical Skills:

– Advanced proficiency in SQL, data modelling tools (e.g., Erwin, PowerDesigner), and data integration frameworks (e.g., Apache NiFi, Talend).

– Strong understanding of data security protocols, privacy regulations, and compliance requirements.

– Experience with data storage solutions (e.g., data lakes, data warehouses, NoSQL, relational databases).


• AI & Machine Learning Exposure:

– Familiarity with leveraging AI and machine learning technologies (e.g., TensorFlow, PyTorch, scikit-learn) to optimize data processing and analytical tasks.

–Ability to apply advanced algorithms and automation techniques to improve business processes.


• Soft Skills:

– Excellent communication skills to collaborate with clients, stakeholders, and cross-functional teams.

– Strong problem-solving ability with a customer-centric approach to solution design.

– Ability to translate complex technical concepts into clear, understandable terms for non-technical audiences.


• Education:

– Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field (or equivalent practical experience).


LIFE AT FOUNTANE:

  • Fountane offers an environment where all members are supported, challenged, recognized & given opportunities to grow to their fullest potential.
  • Competitive pay
  • Health insurance for spouses, kids, and parents.
  • PF/ESI or equivalent
  • Individual/team bonuses
  • Employee stock ownership plan
  • Fun/challenging variety of projects/industries
  • Flexible workplace policy - remote/physical
  • Flat organization - no micromanagement
  • Individual contribution - set your deadlines
  • Above all - culture that helps you grow exponentially!


A LITTLE BIT ABOUT THE COMPANY:

Established in 2017, Fountane Inc is a Ventures Lab incubating and investing in new competitive technology businesses from scratch. Thus far, we’ve created half a dozen multi-million valuation companies in the US and a handful of sister ventures for large corporations, including Target, US Ventures, and Imprint Engine.

We're a team of 80+ from around the world who are radically open-minded, believe in excellence and respecting one another, and push our boundaries further than they've ever been.



Tech Prescient

Ashwini Kulkarni
Posted by Ashwini Kulkarni
Remote, Pune
5 - 7 yrs
₹15L - ₹22L / yr
PowerBI
Data modeling
SQL Query Analyzer

Job Title: Power BI Analyst with SQL Expertise

Experience level: 5 years

Location: Pune / Remote


Overview:

We are looking for a talented Power BI Analyst with strong analytical skills and proficiency in SQL to join our team. This role involves developing insightful dashboards and reports using Power BI, conducting in-depth data analysis, and translating data insights into actionable business recommendations. The ideal candidate will have a knack for turning data into stories and possess a solid foundation in SQL for efficient querying and data manipulation.


Must-Have Skills:

  • Power BI Proficiency: Expertise in designing and developing complex Power BI dashboards and reports.
  • SQL Querying: Strong SQL skills for data extraction, manipulation, and complex query building (see the sketch after this list).
  • Analytical Skills: Ability to analyze large datasets to identify trends, patterns, and insights that support strategic decisions.
  • Data Modeling: Experience with data modeling techniques to ensure optimal performance in Power BI reports.
  • Communication: Ability to communicate insights effectively to non-technical stakeholders.
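
As a purely illustrative example of the SQL skills above, the sketch below pre-aggregates a monthly revenue trend of the kind often loaded into a Power BI dataset. The sales table and its columns are hypothetical, and SQLite (3.25+ for window functions) merely keeps the snippet self-contained.

```python
# Hypothetical monthly trend query with a running total per region.
import sqlite3

MONTHLY_TREND_SQL = """
WITH monthly AS (
    SELECT region,
           strftime('%Y-%m', sold_at) AS sales_month,
           SUM(amount)                AS revenue
    FROM sales
    GROUP BY region, sales_month
)
SELECT region,
       sales_month,
       revenue,
       SUM(revenue) OVER (PARTITION BY region ORDER BY sales_month) AS running_revenue
FROM monthly
ORDER BY region, sales_month;
"""

with sqlite3.connect("sales.db") as conn:  # placeholder database file
    for region, month, revenue, running in conn.execute(MONTHLY_TREND_SQL):
        print(region, month, revenue, running)
```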

Good-to-Have Skills:

  • DAX (Data Analysis Expressions): Experience with DAX for advanced calculations in Power BI.
  • Experience with BI Tools: Familiarity with other BI tools such as Tableau, QlikView, or similar.
  • Python or R: Knowledge of Python or R for data analysis tasks outside of Power BI.
  • Cloud Platforms: Familiarity with data warehousing solutions on AWS, Azure, or GCP.


Key Responsibilities and Duties:

  • Design, develop, and maintain Power BI reports and dashboards that meet business needs.
  • Write, optimize, and troubleshoot complex SQL queries for data extraction and transformation.
  • Perform data analysis to uncover trends and insights, supporting strategic and operational decisions.
  • Collaborate with stakeholders to gather and refine reporting requirements.
  • Ensure data integrity, accuracy, and reliability in all reports and analytics.

Required Qualifications:

  • Bachelor’s degree in Computer Science, Data Analytics, Statistics, or a related field.
  • Proven experience in Power BI report development and SQL querying.
  • Strong analytical skills with a track record of interpreting data to support business decisions.


Preferred Qualifications:

  • Master’s degree in Data Science, Business Analytics, or related fields.
  • Certifications in Power BI, SQL, or data analytics.
  • Experience with DAX, Python, or R.


Other Competencies:

  • Problem-Solving: Ability to approach data challenges with innovative solutions.
  • Attention to Detail: Keen eye for detail to ensure data accuracy and clarity in reports.
  • Collaboration: Works well in a team environment, especially with cross-functional stakeholders.
OnActive
Mansi Gupta
Posted by Mansi Gupta
Remote only
8 - 12 yrs
₹20L - ₹25L / yr
PowerBI
DAX
Data modeling
SQL
Python

Senior Data Analyst


Experience: 8+ Years

Work Mode: Remote, Full Time


Responsibilities:

• Analyze large datasets to uncover trends, patterns, and insights to support business goals (a small analysis sketch follows this list).

• Design, develop, and manage interactive dashboards and reports using Power BI.

• Utilize DAX and SQL for advanced data querying and data modeling.

• Create and manage complex SQL queries for data extraction, transformation, and loading processes.

• Collaborate with cross-functional teams to understand data requirements and translate them into actionable solutions.

• Maintain data accuracy and integrity across projects, ensuring reliable data-driven insights.

• Present findings to stakeholders, translating complex data insights into simple, actionable business recommendations.
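
As a small, hypothetical illustration of the trend-analysis responsibility above, the pandas sketch below computes month-over-month revenue growth per channel; the data and column names are invented.

```python
# Hypothetical trend analysis: month-over-month revenue growth per channel.
import pandas as pd

df = pd.DataFrame({
    "month":   ["2024-01", "2024-01", "2024-02", "2024-02"],
    "channel": ["web", "store", "web", "store"],
    "revenue": [120.0, 80.0, 150.0, 70.0],
})

monthly = df.groupby(["channel", "month"], as_index=False)["revenue"].sum()
monthly = monthly.sort_values(["channel", "month"])
monthly["mom_growth"] = monthly.groupby("channel")["revenue"].pct_change()
print(monthly)
```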


Skills:

Power BI, DAX (Data Analysis Expressions), SQL, Data Modeling, Python


Preferred Skills:

• Machine Learning: Exposure to machine learning models and their integration within analytical solutions.

• Microsoft Fabric: Familiarity with Microsoft Fabric for enhanced data integration and management.

Wissen Technology

Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 5 yrs
Best in industry
SQL
Amazon Web Services (AWS)
Python
Data Warehouse (DWH)
Informatica
+5 more

Job Description for Data Engineer Role:

Must have:

Experience working with programming languages; solid foundational and conceptual knowledge is expected.

Experience working with databases and SQL optimization.

Experience as a team lead or tech lead, able to independently drive technical decisions and execution and to motivate the team in ambiguous problem spaces.

Problem-solving, judgment, and strategic decision-making skills to drive the team forward.

 

Role and Responsibilities:

  • Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal and external technology communities, and mentoring other members of the engineering community; from time to time you may be asked to write or evaluate code
  • Collaborate with digital product managers and leaders from other teams to refine the strategic needs of the project
  • Utilize programming languages like Java, Python, SQL, Node, Go, and Scala, plus open-source RDBMS and NoSQL databases
  • Define best practices for data validation, automating as much as possible and aligning with enterprise standards (a minimal validation sketch follows)
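
A minimal sketch of what automated data validation along those lines might look like is below; the checks and column names (order_id, customer_id, amount) are assumptions for illustration.

```python
# Illustrative batch validation: collect failures instead of failing fast.
import pandas as pd


def validate_batch(df: pd.DataFrame) -> list:
    failures = []
    if df.empty:
        failures.append("batch is empty")
    if df["customer_id"].isna().any():
        failures.append("null customer_id values found")
    if (df["amount"] < 0).any():
        failures.append("negative amounts found")
    if df["order_id"].duplicated().any():
        failures.append("duplicate order_id values found")
    return failures


batch = pd.DataFrame({"order_id": [1, 2], "customer_id": [10, 11], "amount": [5.0, 7.5]})
problems = validate_batch(batch)
print("OK" if not problems else problems)
```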

Qualifications -

  • Experience with SQL and NoSQL databases.
  • Experience with cloud platforms, preferably AWS.
  • Strong experience with data warehousing and data lake technologies (Snowflake)
  • Expertise in data modelling
  • Experience with ETL/ELT tools and methodologies
  • Experience working on real-time Data Streaming and Data Streaming platforms
  • 2+ years of experience in at least one of the following: Java, Scala, Python, Go, or Node.js
  • 2+ years working with SQL and NoSQL databases, data modeling and data management
  • 2+ years of experience with AWS, GCP, Azure, or another cloud service.


Hyderabad
3 - 6 yrs
₹10L - ₹16L / yr
SQL
Spark
Analytical Skills
Hadoop
Communication Skills
+4 more

The Sr. Analytics Engineer would provide technical expertise in needs identification, data modeling, data movement, and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective.


Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.


Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.


Actively participates with other consultants in problem-solving and approach development.


Responsibilities :


Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.


Perform data analysis to validate data models and to confirm the ability to meet business needs.


Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.


Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.


Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.


Coordinate with Data Architects, Program Managers and participate in recurring meetings.


Help and mentor team members to understand the data model and subject areas.


Ensure that the team adheres to best practices and guidelines.


Requirements :


- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.


- Experience with Spark optimization/tuning/resource allocations


- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences (a tuning sketch follows this list).


- Experience with relational databases (e.g., PostgreSQL, MySQL) and analytical/NoSQL databases (e.g., Redshift, BigQuery, Cassandra, etc.).


- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob storage, AWS S3, Google Cloud storage, etc.


- Have a deep understanding of the various stacks and components of the Big Data ecosystem.


- Hands-on experience with Python is a huge plus
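
As a hedged illustration of the Spark parameter tuning mentioned above, the PySpark sketch below sets a few commonly adjusted knobs; the values and S3 paths are placeholders to be sized against the actual cluster and data volume.

```python
# Hypothetical tuned Spark session; every value here is a placeholder.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("tuned-etl")
    .config("spark.sql.shuffle.partitions", "400")  # match shuffle width to data volume
    .config("spark.executor.memory", "8g")          # executor heap size
    .config("spark.executor.cores", "4")            # cores per executor
    .config("spark.sql.adaptive.enabled", "true")   # adaptive query execution
    .getOrCreate()
)

events = spark.read.parquet("s3a://example-bucket/events/")  # placeholder path
(events.repartition("event_date")
       .write.mode("overwrite")
       .partitionBy("event_date")
       .parquet("s3a://example-bucket/events_curated/"))
```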

Webkul Software Pvt Ltd
Khushboo Tripathi
Posted by Khushboo Tripathi
Noida
0 - 1 yrs
₹3L - ₹5L / yr
Salesforce
Data modeling
Data security
Automation
Salesforce Apex
+4 more

Experience: 0 - 1 year

Requirement: 

Expertise in Manual testing

Good Understanding of the Salesforce Platform

Experience of Salesforce App testing

Salesforce Certified professional would be preferred

ISTQB certified professional would be preferred


Rounds:

R1: Written - multiple-choice and scenario-based questions

R2: F2F Interview


A link to the candidate's Salesforce Trailhead profile is a must; please ask each applicant to provide it beforehand for shortlisting.

Client based at Pune location.


Agency job
Pune
5 - 9 yrs
₹18L - ₹30L / yr
Data Engineer
Python
Data Warehousing
Snowflake schema
Data modeling
+7 more

Skills & Experience:

❖ 5+ years of experience as a Data Engineer

❖ Hands-on and in-depth experience with star/snowflake schema design, data modeling, data pipelining and MLOps (a small schema sketch follows this list)

❖ Experience in Data Warehouse technologies (e.g., Snowflake, AWS Redshift, etc.)

❖ Experience in AWS data pipelines (Lambda, AWS Glue, Step Functions, etc.)

❖ Proficient in SQL

❖ At least one major programming language (Python / Java)

❖ Experience with Data Analysis Tools such as Looker or Tableau

❖ Experience with Pandas, Numpy, Scikit-learn, and Jupyter notebooks preferred

❖ Familiarity with Git, GitHub, and JIRA.

❖ Ability to locate & resolve data quality issues

❖ Ability to demonstrate end-to-end data platform support experience
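
For the star/snowflake schema design called out above, here is a small illustrative dimensional model; table and column names are hypothetical, and SQLite merely keeps the DDL runnable in one file.

```python
# Illustrative star schema: one fact table keyed to two dimensions.
import sqlite3

DDL = """
CREATE TABLE IF NOT EXISTS dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_id  TEXT NOT NULL,
    segment      TEXT
);
CREATE TABLE IF NOT EXISTS dim_date (
    date_key  INTEGER PRIMARY KEY,   -- e.g. 20240131
    full_date TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL NOT NULL
);
"""

with sqlite3.connect("warehouse.db") as conn:  # placeholder warehouse file
    conn.executescript(DDL)
```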

Other Skills:

❖ Individual contributor

❖ Hands-on with the programming

❖ Strong analytical and problem solving skills with meticulous attention to detail

❖ A positive mindset and can-do attitude

❖ To be a great team player

❖ To have an eye for detail

❖ Looking for opportunities to simplify, automate tasks, and build reusable components.

❖ Ability to judge suitability of new technologies for solving business problems

❖ Build strong relationships with analysts, business, and engineering stakeholders

❖ Task Prioritization

❖ Familiar with agile methodologies.

❖ Fintech or Financial services industry experience

❖ Eagerness to learn about the Private Equity/Venture Capital ecosystem and the associated secondary market

Responsibilities:

o Design, develop and maintain a data platform that is accurate, secure, available, and fast.

o Engineer efficient, adaptable, and scalable data pipelines to process data.

o Integrate and maintain a variety of data sources: different databases, APIs, SaaS products, files, logs, events, etc.

o Create standardized datasets to service a wide variety of use cases.

o Develop subject-matter expertise in tables, systems, and processes.

o Partner with product and engineering to ensure product changes integrate well with the data platform.

o Partner with diverse stakeholder teams, understand their challenges and empower them with data solutions to meet their goals.

o Perform data quality checks on data sources and automate and maintain a quality control capability.

Tech company based in the US


Agency job
via Grow Your Staff by Erica Cyril
Remote only
2 - 5 yrs
₹8L - ₹10L / yr
Microsoft Visio
CorelDRAW
MS-Excel
MS-PowerPoint
BRD
+3 more

Grow Your Staff is looking for a Technical Business Analyst for an InsurTech firm in the US. The position is a full-time remote opportunity.


The role will have excellent growth opportunities. You will work directly with the team based in the US. 


Experience required: 2-5 years 

CTC: INR 7-12 LPA

Location: Remote

Type of Employment- Full-time

Time: 5:30 pm - 2:30 am (Monday-Friday)


Responsibilities

  • Performing requirements analysis.
  • Evaluating business requirements, uncovering areas for improvement, and developing and implementing solutions.
  • Documenting business processes and procedures, and developing optimization strategies.
  • Staying up to date on the latest process and IT advancements to build and enhance digital applications 
  • Conducting meetings and presentations to share ideas and findings.
  • Documenting and communicating the results of your efforts.
  • Effectively communicating your insights and plans to cross-functional team members and management.
  • Ensuring solutions meet business needs and requirements.
  • Performing user acceptance testing.
  • Managing projects, developing project plans, and monitoring performance.
  • Prioritising initiatives based on business needs and requirements.


Qualifications

  • A bachelor’s degree in business or a related field.
  • Experience with Software Development Life Cycle (SDLC) & Agile Development
  • Exceptional analytical and conceptual thinking skills.
  • The ability to influence stakeholders and work closely with them to determine acceptable solutions.
  • Competency in applications including Word, Excel, Visio/CorelDraw/Adobe
  • Fundamental analytical and conceptual thinking skills.
  • UX Design Skills (preferable)
  • Experience creating detailed reports and giving presentations.
  • Excellent planning, organisational, and time management skills.


PLOT NO 1253, ROAD NO 63, JUBILEE HILLS, Hyderabad, Telangana, 500033

https://www.growyourstaff.com/

Wissen Technology

Vijayalakshmi Selvaraj
Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Pune, Mumbai
10 - 22 yrs
Best in industry
DynamoDB
AWS
EMR
Data migration
Data modeling

Responsibilities:

 

  • Design, implement, and maintain scalable and reliable database solutions on the AWS platform.
  • Architect, deploy, and optimize DynamoDB databases for performance, scalability, and cost-efficiency (see the sketch after this list).
  • Configure and manage AWS OpenSearch (formerly Amazon Elasticsearch Service) clusters for real-time search and analytics capabilities.
  • Design and implement data processing and analytics solutions using AWS EMR (Elastic MapReduce) for large-scale data processing tasks.
  • Collaborate with cross-functional teams to gather requirements, design database solutions, and implement best practices.
  • Perform performance tuning, monitoring, and troubleshooting of database systems to ensure high availability and performance.
  • Develop and maintain documentation, including architecture diagrams, configurations, and operational procedures.
  • Stay current with the latest AWS services, database technologies, and industry trends to provide recommendations for continuous improvement.
  • Participate in the evaluation and selection of new technologies, tools, and frameworks to enhance database capabilities.
  • Provide guidance and mentorship to junior team members, fostering knowledge sharing and skill development.
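
As a hedged sketch of the DynamoDB design work referenced in this list, the boto3 snippet below creates a table with a composite primary key and on-demand billing; the table and attribute names are illustrative assumptions.

```python
# Illustrative DynamoDB table design: partition by customer, sort by time.
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")  # placeholder region

dynamodb.create_table(
    TableName="orders",  # placeholder table
    AttributeDefinitions=[
        {"AttributeName": "customer_id", "AttributeType": "S"},
        {"AttributeName": "order_ts", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "customer_id", "KeyType": "HASH"},  # partition key
        {"AttributeName": "order_ts", "KeyType": "RANGE"},    # sort key
    ],
    BillingMode="PAY_PER_REQUEST",  # on-demand capacity for spiky workloads
)
```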

 

Requirements:

 

  • Bachelor’s degree in computer science, Information Technology, or related field.
  • Proven experience as an AWS Architect or similar role, with a focus on database technologies.
  • Hands-on experience designing, implementing, and optimizing DynamoDB databases in production environments.
  • In-depth knowledge of AWS OpenSearch (Elasticsearch) and experience configuring and managing clusters for search and analytics use cases.
  • Proficiency in working with AWS EMR (Elastic MapReduce) for big data processing and analytics.
  • Strong understanding of database concepts, data modelling, indexing, and query optimization.
  • Experience with AWS services such as S3, EC2, RDS, Redshift, Lambda, and CloudFormation.
  • Excellent problem-solving skills and the ability to troubleshoot complex database issues.
  • Solid understanding of cloud security best practices and experience implementing security controls in AWS environments.
  • Strong communication and collaboration skills with the ability to work effectively in a team environment.
  • AWS certifications such as AWS Certified Solutions Architect, AWS Certified Database - Specialty, or equivalent certifications are a plus.


Fictiv

Margaret Moses
Posted by Margaret Moses
Pune
5 - 12 yrs
Best in industry
Salesforce Apex
Salesforce Visualforce
Salesforce Lightning
Salesforce development
Data modeling

What’s in it for you?

Opportunity To Unlock Your Creativity

Think of all the times you were held back from trying new ideas because you were boxed in by bureaucratic legacy processes or old-school tactics. A growth mindset has been deeply ingrained in our company culture since day 1, so Fictiv is an environment where you have the creative liberty and the support of the team to try big, bold ideas to achieve our sales and customer goals.

Opportunity To Grow Your Career

At Fictiv, you'll be surrounded by supportive teammates who will push you to be your best through their curiosity and passion.

 


Impact In This Role

Excellent problem solving, decision-making and critical thinking skills.

Collaborative, a team player.

Excellent verbal and written communication skills.

Exhibits initiative, integrity and empathy.

Enjoy working with a diverse group of people in multiple regions.

Comfortable not knowing answers, but resourceful and able to resolve issues.

Self-starter; comfortable with ambiguity, asking questions and constantly learning.

Customer service mentality; advocates for another person's point of view.

Methodical and thorough in written documentation and communication.

Culture oriented; wants to work with people rather than in isolation.

You will report to the Director of IT Engineering.

What You’ll Be Doing

  • Interface with Business Analysts and Stakeholders to understand & clarify requirements
  • Develop technical designs for solution development
  • Implement high quality, scalable solutions following best practices, including configuration and code.
  • Deploy solutions and code using automated deployment tools
  • Take ownership of technical deliverables, ensure that quality work is completed, fully tested, delivered on time.
  • Conduct code reviews, optimization, and refactoring to minimize technical debt within Salesforce implementations.
  • Collaborate with cross-functional teams to integrate Salesforce with other systems and platforms, ensuring seamless data flow and system interoperability.
  • Identify opportunities for process improvements, mentor and support other developers/team members as needed.
  • Stay updated on new Salesforce features and functionalities and provide recommendations for process improvements.


Desired Traits

  • 8-10 years of experience in Salesforce development
  • Proven experience in developing Salesforce solutions with a deep understanding of Apex, Visualforce, Lightning Web Components, and Salesforce APIs (see the sketch after this list).
  • Have worked in Salesforce CPQ, Sales/Manufacturing Cloud, Case Management
  • Experienced in designing and implementing custom solutions that align with business needs.
  • Strong knowledge of Salesforce data modeling, reporting, and database design.
  • Demonstrated experience in building and maintaining integrations between Salesforce and external applications.
  • Strong unit testing, functional testing and debugging skills
  • Strong understanding of best practices
  • Active Salesforce Certifications are desirable.
  • Experience in Mulesoft is a plus
  • Excellent communication skills and the ability to translate complex technical requirements into actionable solutions.
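
For the Salesforce API familiarity mentioned above, a minimal, hypothetical sketch of querying the REST API from Python follows; the instance URL, API version, and token are placeholders (the access token would come from an OAuth flow not shown here).

```python
# Illustrative SOQL query via the Salesforce REST API.
import requests

INSTANCE = "https://example.my.salesforce.com"  # placeholder org URL
TOKEN = "<access-token>"                        # obtained via OAuth elsewhere

resp = requests.get(
    f"{INSTANCE}/services/data/v59.0/query",
    params={"q": "SELECT Id, Name FROM Account LIMIT 5"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
for record in resp.json()["records"]:
    print(record["Id"], record["Name"])
```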

Interested in learning more? We look forward to hearing from you soon.

PloPdo
Chandan Nadkarni
Posted by Chandan Nadkarni
Hyderabad
3 - 12 yrs
₹22L - ₹25L / yr
Cassandra
Data modeling

Responsibilities -

  • Collaborate with the development team to understand data requirements and identify potential scalability issues.
  • Design, develop, and implement scalable data pipelines and ETL processes to ingest, process, and analyse large volumes of data from various sources.
  • Optimize data models and database schemas to improve query performance and reduce latency.
  • Monitor and troubleshoot the performance of our Cassandra database on Azure Cosmos DB, identifying bottlenecks and implementing optimizations as needed (see the sketch after this list).
  • Work with cross-functional teams to ensure data quality, integrity, and security.
  • Stay up to date with emerging technologies and best practices in data engineering and distributed systems.
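
A hedged sketch of the kind of Cassandra data modelling this role involves is below: partitioning by device and clustering by time keeps a device's recent readings in a single partition. The keyspace, table, and contact point are illustrative, and Azure Cosmos DB's Cassandra API accepts the same CQL with its own connection settings.

```python
# Illustrative time-series model in Cassandra (requires cassandra-driver).
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # placeholder contact point
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS telemetry
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS telemetry.readings (
        device_id  text,
        reading_ts timestamp,
        value      double,
        PRIMARY KEY ((device_id), reading_ts)
    ) WITH CLUSTERING ORDER BY (reading_ts DESC)
""")
```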


Qualifications & Requirements -

  • Proven experience as a Data Engineer or similar role, with a focus on designing and optimizing large-scale data systems.
  • Strong proficiency in working with NoSQL databases, particularly Cassandra.
  • Experience with cloud-based data platforms, preferably Azure Cosmos DB.
  • Solid understanding of distributed systems, data modelling, data warehouse design, and ETL processes.
  • Detailed understanding of Software Development Life Cycle (SDLC) is required.
  • Good to have knowledge on any visualization tool like Power BI, Tableau.
  • Good to have knowledge on SAP landscape (SAP ECC, SLT, BW, HANA etc).
  • Good to have experience on Data Migration Project.
  • Knowledge of Supply Chain domain would be a plus.
  • Familiarity with software architecture (data structures, data schemas, etc.)
  • Familiarity with Python programming language is a plus.
  • The ability to work in a dynamic, fast-paced, work environment.
  • A passion for data and information with strong analytical, problem solving, and organizational skills.
  • Self-motivated with the ability to work under minimal direction.
  • Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.


Arting Digital
Pragati Bhardwaj
Posted by Pragati Bhardwaj
Bengaluru (Bangalore)
10 - 16 yrs
₹10L - ₹15L / yr
Databricks
Data modeling
SQL
Python
AWS Lambda
+2 more

Title: Lead Data Engineer


Experience: 10+ years

Budget: 32-36 LPA

Location: Bangalore 

Work Mode: Work from office

Primary Skills: Databricks, Spark, PySpark, SQL, Python, AWS

Qualification: Any Engineering degree


Roles and Responsibilities:


• 8-10+ years' experience in developing scalable Big Data applications or solutions on distributed platforms.

• Able to partner with others in solving complex problems by taking a broad perspective to identify innovative solutions.

• Strong skills building positive relationships across Product and Engineering.

• Able to influence and communicate effectively, both verbally and written, with team members and business stakeholders.

• Able to quickly pick up new programming languages, technologies, and frameworks.

• Experience working in Agile and Scrum development processes.

• Experience working in a fast-paced, results-oriented environment.

• Experience in Amazon Web Services (AWS), mainly S3, Managed Airflow, EMR/EC2, IAM, etc.

• Experience working with Data Warehousing tools, including SQL databases, Presto, and Snowflake.

• Experience architecting data products in streaming, serverless and microservices architectures and platforms.

• Experience working with data platforms, including EMR, Airflow, Databricks (Data Engineering & Delta Lake components, and Lakehouse Medallion architecture), etc.

• Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for Managed Spark jobs, building Docker images, etc.

• Experience working with distributed technology tools, including Spark, Python, Scala.

• Working knowledge of Data Warehousing, Data Modelling, Governance and Data Architecture.

• Working knowledge of Reporting & Analytical tools such as Tableau, QuickSight, etc.

• Demonstrated experience in learning new technologies and skills.

• Bachelor's degree in Computer Science, Information Systems, Business, or another relevant subject area.

Insurance Org


Agency job
via InvokHR by Jessica Chen
Gurugram
3 - 8 yrs
₹10L - ₹15L / yr
Guidewire
GOSU
API
User Interface (UI) Development
Data modeling
+2 more

Role Title: Developer - Guidewire Integration-Config

 

 

Role Purpose

We are looking for a Developer for our Claims Guidewire team, who is a technology enthusiast, and eager to be part of a culture of modern software engineering practices, continuous improvement, and innovation.

 

As a Developer, you will be part of a dynamic engineering team and work on development, maintenance, and transformation of our strategic Claims Guidewire platform. You will learn about software applications, technology stack, ways of working and standards.

 

 

Key Accountabilities

 

·        Deliver software development tasks for Claims Guidewire applications, in the areas of Integration and Configuration, with expected quality measures and timeframe, e.g., coding, writing unit test cases (G-Unit) and unit testing, debugging and defect fixing, providing test support, providing release support.

·        Communicate with technical leads and IT groups for understanding the project’s technical implications, dependencies, and potential conflicts.

·        Research issues reported in Production, perform root cause analysis and document them, respond to and resolve technical issues in a timely manner.

·        Perform versioning of the release updates and resolve the code conflicts while merging and promoting the code to higher environments.

·        Develop their technical and functional knowledge on Claims Digital Guidewire platform.

·        Understand and follow Guidewire’s cloud standards for application development.

·        Active participation in team meetings like daily stand-ups, risk forums, planning sessions and retrospectives.



Skills & Experience

·        3+ years of development experience on Guidewire cloud platform and applications, Guidewire certification preferred.

·        Hands-on development expertise on Guidewire ClaimCenter, with configuration and integration

·        Experience in Guidewire platform (Gosu scripting / Edge APIs / UI / Data Model)

·        Should have knowledge on Admin data loading, Assignment and Segmentation Rules, Pre-update and Validation rules, Authority limits, Financials (checks, reserves, recoveries …)

·        Good experience on LOB configuration and related type-lists

·        Good experience on integration components including plug-ins, messaging (and supporting business rules), batches, REST APIs and programs that call the Guidewire application APIs.

·        Experience on any database Oracle / SQL Server and well versed in SQL.

·        Experience of working in a CI/CD setup and related tools/technologies

·        Insurance domain knowledge with Property & Casualty background preferred.

 


Location: Gurugram

CTC: Up to ₹25 LPA


PGP Glass Pvt Ltd
Animesh Srivastava
Posted by Animesh Srivastava
Vadodara
1 - 4 yrs
₹6L - ₹13L / yr
Data modeling
Machine Learning (ML)

Key Roles/Responsibilities:

• Develop an understanding of business obstacles, create solutions based on advanced analytics and draw implications for model development.

• Combine, explore and draw insights from data, often large and complex data assets from different parts of the business.

• Design and build explorative, predictive or prescriptive models, utilizing optimization, simulation and machine learning techniques.

• Prototype and pilot new solutions, and be a part of the aim of 'productifying' those valuable solutions that can have impact at a global scale.

• Guide and coach other chapter colleagues to help solve data/technical problems at an operational level, and in methodologies to help improve development processes.

• Identify and interpret trends and patterns in complex data sets to enable the business to take data-driven decisions.
This opening is with an MNC


Agency job
via LK Consultants by Namita Agate
Mumbai, Malad, Andheri
8 - 13 yrs
₹13L - ₹22L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+8 more

Minimum of 8 years of experience, of which 4 years should be of applied data mining experience in disciplines such as Call Centre Metrics.

• Strong experience in advanced statistics and analytics including segmentation, modelling, regression, forecasting, etc.

• Experience with leading and managing large teams.

• Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.

• Demonstrated experience with Business Intelligence/Data Mining tools to work with data, investigate anomalies, construct data sets, and build models.

• Critical to share details on projects undertaken (preferably in the telecom industry), specifically through analysis from CRM.

Hopscotch
Bengaluru (Bangalore)
5 - 8 yrs
₹6L - ₹15L / yr
Python
Amazon Redshift
Amazon Web Services (AWS)
PySpark
Data engineering
+3 more

About the role:

Hopscotch is looking for a passionate Data Engineer to join our team. You will work closely with other teams like data analytics, marketing, data science and individual product teams to specify, validate, prototype, scale, and deploy data pipeline features and data architecture.


Here’s what will be expected out of you:

➢ Ability to work with a fast-paced startup mindset; should be able to manage all aspects of data extraction, transfer, and load activities.

➢ Develop data pipelines that make data available across platforms.

➢ Should be comfortable in executing ETL (Extract, Transform and Load) processes which include data ingestion, data cleaning and curation into a data warehouse, database, or data platform.

➢ Work on various aspects of the AI/ML ecosystem – data modeling, data and ML pipelines.

➢ Work closely with Devops and senior Architect to come up with scalable system and model architectures for enabling real-time and batch services.


What we want:

➢ 5+ years of experience as a data engineer or data scientist with a focus on data engineering and ETL jobs.

➢ Well versed with the concept of Data warehousing, Data Modelling and/or Data Analysis.

➢ Experience using and building pipelines and performing ETL with industry-standard best practices on Redshift (2+ years; see the sketch after this list).

➢ Ability to troubleshoot and solve performance issues with data ingestion, data processing & query execution on Redshift.

➢ Good understanding of orchestration tools like Airflow.

 ➢ Strong Python and SQL coding skills.

➢ Strong Experience in distributed systems like spark.

➢ Experience with AWS Data and ML Technologies (AWS Glue, MWAA, Data Pipeline, EMR, Athena, Redshift, Lambda, etc.).

➢ Solid hands-on experience with various data extraction techniques like CDC or time/batch-based and the related tools (Debezium, AWS DMS, Kafka Connect, etc.) for near-real-time and batch data extraction.
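
As a hedged illustration of the Redshift experience called out above, the snippet below shows the common S3-to-Redshift batch-load pattern via COPY; the cluster endpoint, IAM role, bucket, and table are placeholders. psycopg2 works here because Redshift speaks the PostgreSQL wire protocol.

```python
# Illustrative S3-to-Redshift batch load; all identifiers are placeholders.
import psycopg2

COPY_SQL = """
COPY analytics.events
FROM 's3://example-bucket/events/2024-06-01/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="prod",
    user="etl_user",
    password="<secret>",
)
with conn, conn.cursor() as cur:
    cur.execute(COPY_SQL)
```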


Note :

Experience at product-based or e-commerce companies is an added advantage.

AI Domain US Based Product Based Company


Agency job
via New Era India by Asha P
Bengaluru (Bangalore)
3 - 10 yrs
₹30L - ₹50L / yr
Data engineering
Data modeling
Python

Requirements:

  • 2+ years of experience (4+ for Senior Data Engineer) with system/data integration, development or implementation of enterprise and/or cloud software. Engineering degree in Computer Science, Engineering or a related field.
  • Extensive hands-on experience with data integration/EAI technologies (File, API, Queues, Streams), ETL Tools and building custom data pipelines.
  • Demonstrated proficiency with Python, JavaScript and/or Java
  • Familiarity with version control/SCM is a must (experience with git is a plus).
  • Experience with relational and NoSQL databases (any vendor) Solid understanding of cloud computing concepts.
  • Strong organisational and troubleshooting skills with attention to detail.
  • Strong analytical ability, judgment and problem-solving techniques. Interpersonal and communication skills, with the ability to work effectively in a cross-functional team.


Lifespark Technologies

Amey Desai
Posted by Amey Desai
Mumbai
1 - 3 yrs
₹4L - ₹9L / yr
TensorFlow
Machine Learning (ML)
Computer Vision
Deep Learning
Time series
+4 more

Lifespark is looking for individuals with a passion for impacting real lives through technology. Lifespark is one of the most promising startups in the Assistive Tech space in India, and has been honoured with several National and International awards. Our mission is to create seamless, persistent and affordable healthcare solutions. If you are someone who is driven to make a real impact in this world, we are your people.

Lifespark is currently building solutions for Parkinson's Disease, and we are looking for an ML lead to join our growing team. You will be working directly with the founders on high-impact problems in the Neurology domain. You will be solving some of the most fundamental and exciting challenges in the industry and will have the ability to see your insights turned into real products every day.

 

Essential experience and requirements:

1. Advanced knowledge in the domains of computer vision, deep learning

2. Solid understanding of statistical/computational concepts like Hypothesis Testing, Statistical Inference, Design of Experiments and production-level ML system design

3. Experienced with proper project workflow

4. Good at collating multiple datasets (potentially from different sources)

5. Good understanding of setting up production level data pipelines

6. Ability to independently develop and deploy ML systems to various platforms (local and cloud)

7. Fundamentally strong with time-series data analysis, cleaning, featurization and visualisation (a small featurization sketch follows this list)

8. Fundamental understanding of model and system explainability

9. Proactive at constantly unlearning and relearning

10. Documentation ninja - can understand others' documentation as well as create good documentation
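
For the time-series featurization in item 7 above, a small hypothetical sketch for wearable-style sensor data follows; the sampling rate, windows, and signal are invented for illustration.

```python
# Hypothetical time-series featurization: resample, then rolling statistics.
import numpy as np
import pandas as pd

idx = pd.date_range("2024-01-01", periods=500, freq="20ms")  # ~50 Hz sensor
signal = pd.Series(np.sin(np.linspace(0, 30, 500)), index=idx, name="accel")

resampled = signal.resample("100ms").mean()          # fixed-rate signal
features = pd.DataFrame({
    "rolling_mean": resampled.rolling("1s").mean(),  # smoothed amplitude
    "rolling_std":  resampled.rolling("1s").std(),   # local variability
    "first_diff":   resampled.diff(),                # rate of change
})
print(features.dropna().head())
```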

 

Responsibilities :

1. Develop and deploy ML based systems built upon healthcare data in the Neurological domain

2. Maintain deployed systems and upgrade them through online learning

3. Develop and deploy advanced online data pipelines

Molecular Connections

Molecular Connections
Posted by Molecular Connections
Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹20L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+4 more
  1. Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
  4. Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
  5. Excellent knowledge of Hadoop architecture and daemons of Hadoop clusters, which include Name Node, Data Node, Resource Manager, Node Manager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
  8. Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked on Spark using Scala on cluster for computational analytics; installed it on top of Hadoop and performed advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
  11. Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading data into HDFS.
  12. Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
  13. Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
  17. Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
  18. In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
  19. Hands on expertise in real time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience in Microsoft cloud and setting cluster in Amazon EC2 & S3 including the automation of setting & extending the clusters in AWS Amazon cloud.
  23. Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
  24. Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
  25. Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
  26. Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
  27. Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
  28. In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
  29. Establishing multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provide the access for pulling the information we need for analysis. 
  30. Generated various kinds of knowledge reports using Power BI based on Business specification. 
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
  32. Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
  33. Experienced in designing, built, and deploying and utilizing almost all the AWS stack (Including EC2, S3,), focusing on high-availability, fault tolerance, and auto-scaling.
  34. Good experience with use-case development, with Software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
  39. Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
Mobile Programming LLC

Sukhdeep Singh
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Snowflake schema
Snowflake
+5 more

Job Title: AWS-Azure Data Engineer with Snowflake

Location: Bangalore, India

Experience: 4+ years

Budget: 15 to 20 LPA

Notice Period: Immediate joiners or less than 15 days

Job Description:

We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.

Responsibilities:

  1. Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
  2. Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
  3. Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning (see the sketch after this list).
  4. Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
  5. Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  6. Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
  7. Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
  8. Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
  9. Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
  10. Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
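
As a hedged sketch of the Snowflake schema work in point 3 above, the snippet below uses the Snowflake Python connector to create a simple raw table with a clustering key; the account, credentials, and object names are placeholders.

```python
# Illustrative Snowflake setup via the official Python connector.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.us-east-1",  # placeholder account locator
    user="etl_user",
    password="<secret>",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
cur = conn.cursor()
cur.execute("""
    CREATE TABLE IF NOT EXISTS raw_events (
        event_id STRING,
        event_ts TIMESTAMP_NTZ,
        payload  VARIANT
    ) CLUSTER BY (TO_DATE(event_ts))  -- prune scans on large tables
""")
cur.close()
conn.close()
```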

Requirements:

  1. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  2. Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
  3. Strong proficiency in data modelling, ETL development, and data integration.
  4. Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
  5. In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
  6. Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
  7. Familiarity with data governance principles and security best practices.
  8. Strong problem-solving skills and ability to work independently in a fast-paced environment.
  9. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
  10. Immediate joiner or notice period less than 15 days preferred.

If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

Foxit Software
Remote only
5 - 12 yrs
₹22L - ₹35L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Web application security
+4 more

Experience Required: 5-10 yrs.

Job location: Sec-62, Noida

Work from office (Hybrid)


Development Platform: Backend Development - Java/J2EE, Struts, Spring, MySQL, OWASP


Job Brief:

Requirements:

·      5+ years of experience in developing distributed, multi-tier enterprise applications, APIs.

·      Fully participated in several major product development cycles.

·      Solid background in design, OOP, object, and data modelling.

·      Deep working knowledge of Java, Struts, Spring, and relational databases.

·      Experience in design and implementation of service interface and public APIs.

·      Actively involved/writing codes in current project.

·      Development knowledge and experience of working with AWS, Azure etc. will be an added plus.

·      Clear understanding and hands-on experience of the OWASP Top 10 vulnerability standards like XSS, CSRF, SQL injection, session hijacking, and authorization bypass vulnerabilities (see the sketch after this list).

·      Find and resolve the security concerns on the product/application.

·      Good Documentation, reporting, Strong communication, and collaboration skills with various levels of executives from top management to technical team members across the organization.

·      Strong self-starter who can operate independently.
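
To illustrate the SQL injection item above, here is the standard defence: parameterized queries instead of string concatenation. sqlite3 keeps this sketch self-contained; the same principle applies to JDBC prepared statements in the Java stack this role uses.

```python
# Parameterized queries make hostile quote characters inert.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

user_input = "alice' OR '1'='1"  # hostile input

# Unsafe: f"SELECT * FROM users WHERE name = '{user_input}'" would match every row.
# Safe: the driver binds the value, so the injection attempt finds nothing.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # []
```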

Personal Care Product Manufacturing


Agency job
via Qrata by Rayal Rajan
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink); a small streaming sketch follows this list.

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



Vithamas Technologies Pvt LTD
Mysore
4 - 6 yrs
₹10L - ₹20L / yr
Data modeling
ETL
Oracle
MS SQL Server
MongoDB
+4 more

Required Skills:


• Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models).
• 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, DataStage, etc.
• 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes.
• Candidates should have exposure to a NoSQL technology, preferably MongoDB.
• Experience in processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica or Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).
• Understanding of data warehousing concepts and decision support systems.
• Ability to deal with sensitive and confidential material and adhere to worldwide data security and privacy standards.
• Experience writing documentation for design and feature requirements.
• Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc.
• Excellent communication and collaboration skills.

Porter.in

Agency job via UPhill HR by Ingit Pandey
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹28L / yr
Python
SQL
Data Visualization
Data modeling
Predictive modelling
+1 more

Responsibilities

This role requires a person to support business charters & accompanying products by aligning with the Analytics Manager's vision, understanding tactical requirements and helping in successful execution. The split would be approximately 70% management + 30% individual contribution. Responsibilities include:


Project Management

- Understand business needs and objectives.

- Refine use cases and plan iterations and deliverables - able to pivot as required.

- Estimate efforts and conduct regular task updates to ensure timeline adherence.

- Set and manage stakeholder expectations as required


Quality Execution

- Help BA and SBA resources with requirement gathering and final presentations.

- Resolve blockers regarding technical challenges and decision-making.

- Check final deliverables for correctness and review codes, along with Manager.


KPIs and metrics

- Orchestrate metrics building, maintenance, and performance monitoring.

- Own and manage data models, data sources, and the data definition repo (a small illustrative sketch follows this list).

- Make low-level design choices during execution.
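Purely as an illustration of a data definition repo, the sketch below declares each KPI once as SQL so every report uses the same definition; the metric names and schema are hypothetical:

    # Hypothetical metric-definition repo: each KPI is declared once as SQL
    # and evaluated against the warehouse, keeping definitions consistent.
    import sqlite3

    METRICS = {
        "daily_orders": "SELECT COUNT(*) FROM orders WHERE order_date = ?",
        "daily_revenue": "SELECT COALESCE(SUM(amount), 0) FROM orders WHERE order_date = ?",
    }

    def compute(conn, metric, day):
        (value,) = conn.execute(METRICS[metric], (day,)).fetchone()
        return value

    conn = sqlite3.connect("warehouse.db")  # assumes an `orders` table exists
    print(compute(conn, "daily_orders", "2023-01-01"))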


Team Nurturing

- Help Analytics Manager during regular one-on-ones + check-ins + recruitment.

- Provide technical guidance whenever required.

- Improve benchmarking and decision-making skills at execution-level.

- Train and get new resources up-to-speed.

- Knowledge building (methodologies) to better position the team for complex problems.


Communication

- Upstream to document and discuss execution challenges, process inefficiencies, and feedback loops.

- Downstream and parallel for context-building, mentoring, stakeholder management.


Analytics Stack

- Analytics: Python / R + SQL + Excel / PPT, Colab notebooks

- Database: PostgreSQL, Amazon Redshift, DynamoDB, Aerospike

- Warehouse: Amazon Redshift

- ETL: Lots of Python + custom-made

- Business Intelligence / Visualization: Metabase + Python/R libraries (location data)

- Deployment pipeline: Docker, Git, Jenkins, AWS Lambda

Expand My Business
Remote only
5 - 10 yrs
₹15L - ₹25L / yr
Amazon Web Services (AWS)
Microservices
Data modeling
PostgreSQL
MySQL
+13 more

 

Roles and Responsibilities:

 

● Perform detailed feature requirements analysis along with a team of Senior Developers, define system functionality, work on system design and document the same

● Design, develop and improve Cogno AI's backend infrastructure and stack, and build fault-tolerant, scalable, real-time distributed systems

● Own the design, development and deployment of code to improve product and platform functionality

● Take initiative and propose ideas for improving the technology team's processes, leading to better team performance and more robust solutions

● Writing high-performance, reliable and maintainable code

● Support team with timely analysis and debugging of operational issues

● Emphasis on automation and scripting

● Cross-functional communication to deliver projects

● Mentor junior team members technically and manage a team of software engineers

● Taking interviews and making tests for hiring people in the technology team

 

What do we look for?

 

The following are the important eligibility requirements for this Job:

● Bachelor's or Master's degree in computer science or equivalent.

● 5+ years of experience working as a software engineer, preferably in a product-based company.

● Experience working with major cloud solutions AWS (preferred), Azure, and GCP.

● Familiarity with 3-Tier, microservices architecture and distributed systems

● Experience with the design & development of RESTful services

● Experience with developing Linux-based applications, networking and scripting.

● Experience with different data stores, data modelling and scaling them

● Familiarity with data stores such as PostgreSQL, MySQL, MongoDB, etc.

● 4+ years of experience with web frameworks (preferably Django, Flask, etc.; a minimal Flask sketch follows this list)

● Good understanding of data structures, multi-threading and concurrency concepts.

● Experience with DevOps tools like Jenkins, Ansible, Kubernetes, and Git is a plus.

● Familiarity with Elasticsearch queries and visualization tools like Grafana and Kibana

● Strong networking fundamentals: Firewalls, Proxies, DNS, Load Balancing, etc.

● Strong analytical and problem-solving skills.

● Excellent written and verbal communication skills.

● Team player, flexible and able to work in a fast-paced environment.

● End-to-end ownership of the product. You own what you develop.
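For a flavour of the REST work mentioned above, here is a minimal Flask sketch (one of the frameworks named in the list; the route and payload are hypothetical):

    # Minimal REST endpoint sketch with Flask; illustrative only.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/api/v1/health")
    def health():
        # A production service would add auth, logging and error handling.
        return jsonify(status="ok")

    if __name__ == "__main__":
        app.run(port=8000)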

xoxoday

Agency job via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
8 - 12 yrs
₹45L - ₹65L / yr
JavaScript
SQL
NoSQL Databases
NodeJS (Node.js)
React Native
+8 more

What is the role?

You will be expected to manage the product plan, engineering, and delivery of Xoxoday Plum. Plum is a rewards and incentives infrastructure for businesses. It's a unified, integrated suite of products to handle various rewarding use cases for consumers, sales, channel partners, and employees. 31% of the total tech team is aligned towards this product and comprises 32 members across Plum Tech, Quality, Design, and Product management. The annual FY 2019-20 revenue for Plum was $40MN and it is showing high growth potential this year as well. The product has a good mix of both domestic and international clientele and is expanding. The role will be based out of our head office in Bangalore, Karnataka; however, we are open to discussing the option of remote working with 25-50% travel.

Key Responsibilities

  • Scope and lead technology with the right product and business metrics.
  • Directly contribute to product development by writing code if required.
  • Architect systems for scale and stability.
  • Serve as a role model for our high engineering standards and bring consistency to the many codebases and processes you will encounter.
  • Collaborate with stakeholders across disciplines like sales, customers, product, design, and customer success.
  • Code reviews and feedback.
  • Build simple solutions and designs over complex ones, and have a good intuition for what is lasting and scalable.
  • Define a process for maintaining a healthy engineering culture ( Cadence for one-on-ones, meeting structures, HLDs, Best Practices In development, etc).

What are we looking for?

  • Manage a senior tech team of more than 5 direct and 25 indirect developers.
  • Should have experience in handling e-commerce applications at scale.
  • Should have 7+ years of experience in software development and agile processes for international e-commerce businesses.
  • Should be an extremely hands-on, full-stack developer with modern architecture experience.
  • Should exhibit the skills to build a good engineering team and culture.
  • Should be able to handle the chaos of product planning and prioritizing with a customer-first approach.
  • Technical proficiency
  • JavaScript, SQL, NoSQL, PHP
  • Frameworks like React, ReactNative, Node.js, GraphQL
  • Database technologies like Elasticsearch, Redis, MySQL, Cassandra, MongoDB, Kafka
  • Dev ops to manage and architect infra - AWS, CI/CD (Jenkins)
  • System Architecture w.r.t Microservices, Cloud Development, DB Administration, Data Modeling
  • Understanding of security principles and possible attacks, and how to mitigate them.

Whom will you work with?

You will lead the Plum Engineering team and work in close conjunction with the Tech leads of Plum with some cross-functional stake with other products. You'll report to the co-founder directly.

What can you look for?

A wholesome opportunity in a fast-paced environment with scale, international flavour, backend, and frontend. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.

We are

A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore, and Dublin. We have three products in our portfolio: Plum, Empuls, and Compass. Xoxoday works with over 1000 global clients. We help our clients in engaging and motivating their employees, sales teams, channel partners, or consumers for better business results.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

Pune
3 - 5 yrs
₹20L - ₹30L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
CI/CD
+12 more

As an engineer, you will help with the implementation and launch of many key product features. You will get an opportunity to work on a wide range of technologies (including Spring, AWS Elastic Search, Lambda, ECS, Redis, Spark, Kafka etc.) and apply new technologies for solving problems. You will have an influence on defining product features, drive operational excellence, and spearhead the best practices that enable a quality product. You will get to work with skilled and motivated engineers who are already contributing to building high-scale and highly available systems.

If you are looking for an opportunity to work on leading technologies, would like to build product technology that caters to millions of customers while providing them the best experience, and relish large ownership and diverse technologies, join our team today!

 

What You'll Do:

  • Creating detailed design, working on development and performing code reviews.
  • Implementing validation and support activities in line with architecture requirements
  • Help the team translate the business requirements into R&D tasks and manage the roadmap of the R&D tasks.
  • Designing, building, and implementation of the product; participating in requirements elicitation, validation of architecture, creation and review of high and low level design, assigning and reviewing tasks for product implementation.
  • Work closely with product managers, UX designers and end users and integrating software components into a fully functional system
  • Ownership of product/feature end-to-end for all phases from the development to the production.
  • Ensuring the developed features are scalable and highly available with no quality concerns.
  • Work closely with senior engineers to refine the design and implementation.
  • Management and execution against project plans and delivery commitments.
  • Assist directly and indirectly in the continual hiring and development of technical talent.
  • Create and execute appropriate quality plans, project plans, test strategies and processes for development activities in concert with business and project management efforts.

The ideal candidate is a passionate engineer about delivering experiences that delight customers and creating solutions that are robust. He/she should be able to commit and own the deliveries end-to-end.

 

 

What You'll Need:

 

  • A Bachelor's degree in Computer Science or related technical discipline.
  • 2-3+ years of Software Development experience with proficiency in Java or equivalent object-oriented languages, coupled with design and SOA
  • Fluency with Java and Spring is desirable.
  • Experience with JEE applications and frameworks like Struts, Spring, MyBatis, Maven, Gradle
  • Strong knowledge of Data Structures, Algorithms and CS fundamentals.
  • Experience in at least one shell scripting language, plus SQL (e.g. SQL Server, PostgreSQL) and data modeling skills
  • Excellent analytical and reasoning skills
  • Ability to learn new domains and deliver output
  • Hands on Experience with the core AWS services
  • Experience working with CI/CD tools (Jenkins, Spinnaker, Nexus, GitLab, TeamCity, GoCD, etc.)

 

  • Expertise in at least one of the following:

    - Kafka, ZeroMQ, AWS SNS/SQS, or equivalent streaming technology

    - Distributed cache/in memory data grids like Redis, Hazelcast, Ignite, or Memcached

    - Distributed column store databases like Snowflake, Cassandra, or HBase

    - Spark, Flink, Beam, or equivalent streaming data processing frameworks

  • Proficiency in writing and reviewing Python and other object-oriented language(s) is a plus
  • Experience building automations and CICD pipelines (integration, testing, deployment)
  • Experience with Kubernetes would be a plus.
  • Good understanding of working with distributed teams using Agile: Scrum, Kanban
  • Strong interpersonal skills as well as excellent written and verbal communication skills

• Attention to detail and quality, and the ability to work well in and across teams

goqii Technologies

Posted by Ambalika Handoo
Mumbai
1 - 5 yrs
₹3L - ₹12L / yr
Android Development
Kotlin
Java
SQLite
Data modeling
+2 more
Job Description
Are you bored of writing banking apps, making people click on more ads, or re-skinning and making clones of Temple Run?
Did you ever think you could use your skills to change the world?
If you consider yourself more of an artist who paints on the technology canvas, we want you!
GOQii is your chance to work with an amazing team at GOQii who are driven by passion and are here to disrupt the wearable technology & fitness space.
Roles and Responsibilities:
• Relevant experience in native app development.
• Solid understanding of the full mobile development life cycle.
• UI development with the latest frameworks and techniques.
• Understanding of asynchronous client/server interfacing.
• Solid grip on SQLite and data modelling.
• Experience with 3rd-party libraries & APIs - social, payment, network, crash, analytics, etc.
• Experience in handling app performance and memory using various tools.
• Focus on building high-performance, stable and maintainable code.
• Good logical and analytical skills.
• Experience with Git / SVN version control software.
• Thorough understanding of OOP concepts.
• Proficient with Java and Kotlin.
• Clear understanding of the Android SDK, Android Studio, APIs, DBs and Material Design.
• Realm and Room databases.
• Understanding of design patterns.
• Background tasks and threading concepts.
Ttec Digital
Hyderabad
1 - 12 yrs
₹6L - ₹25L / yr
Software Development
.NET
.NET Framework
XML
SOAP
+5 more

Position Description:

TTEC Digital is looking for enthusiastic Developers for Genesys Contact Center products and custom-developed cloud solutions. As a Developer, you will function as an active member of the Development team across the Design, Build, Deploy and Accept phases of a project's lifecycle, building web and Windows services, APIs and applications that integrate with our customers' back-end CRM systems, databases, and external 3rd-party APIs.

Responsibilities:

  • Works with customers as needed to translate design requirements into application solutions, ensuring the requirements are met according to the team’s and practice area’s standards and best practices.
  • Communicates with project manager/client to identify application requirements.
  • Ensures applications meet the standards and requirements of both the client and project manager.
  • Conducts tests of the application for functionality, reliability and stabilization.
  • Deploys/implements the application to the client.
  • Maintains and supports existing applications by fixing problems, addressing issues and determining the need for enhancements.
  • Demonstrates concern for meeting client needs in a manner that provides satisfaction and excellent results for the client, leading to additional opportunities within the client account.
  • Performs all tasks within the budget and on time while meeting all necessary project requirements. Communicates regularly if budget and/or scope changes.
  • Demonstrate professionalism and leadership in representing the Company to customers and vendors.
  • Core PureConnect handler development & maintenance.
  • Monitor and respond to system errors. Participate in on-call rotation.
  • Follow-up on and resolve outstanding issues in a timely manner.
  • Update customer to reflect changes in system configuration as needed.
  • Understand system hardware/software to be able to identify problems and provide a remedy.
  • Handle TAC/Engineering escalations as directed by the team lead or team manager.

Requirements

  • Bachelor’s degree in computer science, business, or related area.
  • 3+ years of relevant experience and proven ability as a software developer.
  • Experience with the Microsoft development platform.
  • Experience with .NET Framework.
  • Professional experience with integration services including XML, SOAP, REST, TCP/IP, JavaScript, and HTML.
  • Deep Understanding of application architecture.
  • Familiarity in data modeling and architecture.
  • Deep expertise and familiarity with the PureCloud development platform.

We offer an outstanding career development opportunity, a competitive salary along with full comprehensive benefits.  We are looking for individuals with a team player attitude, strong drive for career growth and a passion for excellence in client support, delivery, and satisfaction.

DASCASE
Remote only
12 - 20 yrs
₹24L - ₹40L / yr
API management
Windows Azure
Spring Boot
Microservices
Cloud Computing
+4 more

API Lead Developer

 

Job Overview:

As an API developer for a very large client, you will be filling the role of a hands-on Azure API Developer. We are looking for someone who has the necessary technical expertise to build and maintain sustainable API solutions to support identified needs and expectations from the client.

 

Delivery Responsibilities

  • Implement an API architecture using Azure API Management, including security, API Gateway, Analytics, and API Services
  • Design reusable assets, components, standards, frameworks, and processes to support and facilitate API and integration projects
  • Conduct functional, regression, and load testing on APIs (a minimal functional check is sketched after this list)
  • Gather requirements and define the strategy for application integration
  • Develop using the following integration protocols/principles: SOAP and the web services stack, REST APIs, RESTful design, and RPC/RFC
  • Analyze, design, and coordinate the development of major components of the APIs including hands on implementation, testing, review, build automation, and documentation
  • Work with DevOps team to package release components to deploy into higher environment
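As a hedged illustration of the functional-testing point above, the sketch below probes an API Management endpoint with the requests library; the URL, key, and response schema are placeholders, not a real service:

    # Toy functional check against an API gateway endpoint; illustrative only.
    import requests

    def check_endpoint(base_url, api_key):
        resp = requests.get(
            f"{base_url}/orders/123",
            headers={"Ocp-Apim-Subscription-Key": api_key},  # Azure APIM's subscription header
            timeout=10,
        )
        assert resp.status_code == 200
        assert "order_id" in resp.json()  # hypothetical response field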

Required Qualifications

  • Expert Hands-on experience in the following:
    • Technologies such as Spring Boot, Microservices, API Management & Gateway, Event Streaming, Cloud-Native Patterns, Observability & Performance optimizations
    • Data modelling, Master and Operational Data Stores, data ingestion & distribution patterns, ETL / ELT technologies, relational and non-relational DBs, DB optimization patterns
  • 5+ years of experience with Azure APIM
  • 8+ years' experience in Azure SaaS and PaaS
  • 8+ years' experience in API Management, including technologies such as Mulesoft and Apigee
  • At least the last 5 years in consulting, with the latest implementation on Azure SaaS services
  • 5+ years in MS SQL / MySQL development, including data modeling, concurrency, stored procedure development and tuning
  • Excellent communication skills with a demonstrated ability to engage, influence, and encourage partners and stakeholders to drive collaboration and alignment
  • High degree of organization, individual initiative, results and solution oriented, and personal accountability and resiliency
  • Should be a self-starter and team player, capable of working with a team of architects, co-developers, and business analysts

 

Preferred Qualifications:

  • Ability to work as a collaborative team, mentoring and training the junior team members
  • Working knowledge on building and working on/around data integration / engineering / Orchestration
  • Position requires expert knowledge across multiple platforms, integration patterns, processes, data/domain models, and architectures.
  • Candidates must demonstrate an understanding of the following disciplines: enterprise architecture, business architecture, information architecture, application architecture, and integration architecture.
  • Ability to focus on business solutions and understand how to achieve them according to the given timeframes and resources.
  • Recognized as an expert/thought leader. Anticipates and solves highly complex problems with a broad impact on a business area.
  • Experience with Agile Methodology / Scaled Agile Framework (SAFe).
  • Outstanding oral and written communication skills including formal presentations for all levels of management combined with strong collaboration/influencing.

 

Preferred Education/Skills:

  • Prefer Master’s degree
  • Bachelor’s Degree in Computer Science with a minimum of 12+ years relevant experience or equivalent.
Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snow flake schema
Snowflake
SQL
Data modeling
Data engineering
+1 more

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions for proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ total years in IT, with 2+ years' experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL and BI projects.
• Must have at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end on-premise data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake – data modelling, ELT using Snowflake SQL, implementing complex stored Procedures and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone and time travel, and an understanding of how to use these features (two of them are sketched after this list)
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience of Agile development methodologies
• Strong written communication skills; effective and persuasive in both written and oral communication
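To make two of the Snowflake features above concrete, here is a hedged sketch using snowflake-connector-python; the credentials and table names are placeholders:

    # Sketch of zero-copy clone and time travel in Snowflake; illustrative only.
    import snowflake.connector

    conn = snowflake.connector.connect(account="...", user="...", password="...")
    cur = conn.cursor()
    cur.execute("CREATE TABLE orders_clone CLONE orders")    # zero-copy clone
    cur.execute("SELECT * FROM orders AT(OFFSET => -3600)")  # time travel: one hour back
    print(cur.fetchall())
    conn.close()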

Nice to have Skills/Qualifications:
• Bachelor's and/or master's degree in computer science or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

Nickelfox

Posted by Aakriti Ishwar
Noida
5 - 10 yrs
₹1L - ₹21L / yr
Python
Django
Flask
Data modeling
Design patterns
+2 more
Job Description
  • Lead multiple client projects in the organization. 
  • Define & build technical architecture for projects. 
  • Introduce & ensure right coding standards within the team and ensure that it is maintained in all the projects. 
  • Ensure the quality delivery of projects. 
  • Ensure applications conform to security guidelines wherever required. 
  • Assist pre-sales/sales team for converting raw requirements from potential clients to functional solutions. 
  • Train fellow team members to impose the best practices available. 
  • Work on improving and managing processes within the team. 
  • Implement innovative ideas throughout the team to improve the overall efficiency and quality of the team. 
  • Ensure proper communication & collaboration within the team 


Requirements

7+ Years of experience in developing large scale applications. 
Solid domain knowledge and experience with various design patterns & data modelling (associations, OOP concepts, etc.)
Should have exposure to multiple backend technologies and databases, both relational and NoSQL
Should be aware of the latest conventions for APIs
Preferred hands-on experience with GraphQL as well as REST APIs (a tiny GraphQL sketch follows this section)
Must be well aware of the latest technological advancements for relevant platforms.
Advanced database concepts - views, stored procedures, database optimization - are good to have.
Should have Research Oriented Approach 
Solid at Logical thinking and Problem solving 
Solid understanding of Coding Standards, Code Review processes and delivery of quality products 
Experience with various Tools used in Development, Tests & Deployments. 
Sound knowledge of DevOps and CI/CD Pipeline Tools 
Solid experience with Git Workflow on Enterprise projects and larger teams 
Should be good at documentation at both project and code level; should have good experience with Agile methodology and processes
Should have a good understanding of server-side deployment, scalability, maintainability and handling server security problems.
Should have a good understanding of software UX
Proficient with communication and good at making software architectural judgments
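As a tiny, hedged illustration of the GraphQL point above (using the graphene library, assumed available; the field is hypothetical):

    # Minimal GraphQL schema sketch in Python with graphene.
    import graphene

    class Query(graphene.ObjectType):
        hello = graphene.String(name=graphene.String(default_value="world"))

        def resolve_hello(root, info, name):
            return f"Hello, {name}!"

    schema = graphene.Schema(query=Query)
    print(schema.execute("{ hello }").data)  # {'hello': 'Hello, world!'}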
Expected outcomes 

  • Growing the team and retaining talent, thus, creating an inspiring environment for the team members. 
  • Creating more leadership within the team along with mentoring and guiding new joiners and experienced developers. 
  • Creating growth plans for the team and preparing training guides for other team members. 
  • Refining processes in the team on a regular basis to ensure quality delivery of projects- such as coding standards, project collaboration, code review processes etc. 
  • Improving overall efficiency and team productivity by introducing new methodologies and ideas in the team. 
  • Working on R&D and employing innovative technologies in the company. 
  • Streamlining processes which will result in saving time and cost optimization 
  • Ensuring code review healthiness and shipping superior quality code 


 


Benefits

  • Unlimited learning and growth opportunities 
  • A collaborative and cheerful work environment 
  • Exceptional reward and recognition policy  
  • Outstanding compensation  
  • Flexible work hours  
  • Opportunity to make an impact as your work will directly contribute to our business strategy.

At Nickelfox, you have a chance to craft a career path as unique as you are and become the best version of YOU. You will be part of a team with a ‘no limits’ mindset in an inclusive, people-focused culture. And we’re counting on your unique perspective to help Nickelfox grow even faster.  

Are you passionate about tech? Dedicated to learning? Come, join us to build an extraordinary experience for yourself and a dignified working world for all. 
 

What makes Nickelfox a great place for you?

In Nickelfox, you’ll join a team whose passion for technology and understanding of business has driven the company to serve clients across 25+ countries in just five years. We partner with our customers to fuel their growth story and enable them to make the right decisions with our customized technology services and insights. All in all, we are passionate to see our customers win the day. This is the reason why 80% of our business comes from repeat clients.  

Our mission is to provide dignified employment and an environment that recognizes the uniqueness of every individual and values their expertise, and contribution. We have a culture that encourages everyone to bring their authentic selves to work. Our people enjoy a collaborative work environment with exceptional training and career development. If you like working with a curious, youthful, high-performing team, Nickelfox is the place for you.

T500

Agency job via Talent500 by ANSR by Raghu R
Bengaluru (Bangalore)
3 - 9 yrs
₹10L - ₹30L / yr
Informatica MDM
Data modeling
IDQ

Primary Duties and Responsibilities 

  • Experience with Informatica Multidomain MDM 10.4 tool suite preferred
  • Partnering with data architects and engineers to ensure an optimal data model design and implementation for each MDM domain in accordance with industry and MDM best practices
  • Works with data governance and business steward(s) to design, develop, and configure business rules for data validation, standardization, match, and merge (the match/merge idea is sketched after this list)
  • Implementation of Data Quality policies, procedures and standards along with Data Governance Team for maintenance of customer, location, product, and other data domains; Experience with Informatica IDQ tool suite preferred.
  • Performs data analysis and source-to-target mapping for ingest and egress of data.
  • Maintain compliance with change control, SDLC, and development standards.
  • Champion the creation and contribution to technical documentation and diagrams.
  • Establishes a technical vision and strategy with the team and works with the team to turn it into reality.
  • Emphasis on coaching and training to cultivate skill development of team members within the department.
  • Responsible for keeping up with industry best practices and trends.
  • Monitor, troubleshoot, maintain, and continuously improve the MDM ecosystem.
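To make the match-and-merge idea above concrete, here is a purely conceptual sketch in standard-library Python; Informatica MDM configures this declaratively, and the thresholds and fields here are hypothetical:

    # Conceptual match/merge sketch; not how Informatica implements it.
    from difflib import SequenceMatcher

    def is_match(a, b, threshold=0.9):
        # Fuzzy-match two name strings.
        return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

    def merge(golden, candidate):
        # Toy survivorship rule: keep the longer (assumed more complete) value.
        keys = set(golden) | set(candidate)
        return {k: max(golden.get(k, ""), candidate.get(k, ""), key=len) for k in keys}

    print(is_match("Acme Corp", "Acme Corp."))  # True at the 0.9 threshold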

Secondary Duties and Responsibilities

  • May participate in off-hours on-call rotation.
  • Attends and is prepared to participate in team, department and company meetings.
  • Performs other job related duties and special projects as assigned.

Supervisory Responsibilities

This is a non-management role

Education and Experience

  • Bachelor's degree in MIS, Computer Sciences, Business Administration, or related field; or High School Degree/General Education Diploma and 4 years of relevant experience in lieu of Bachelor's degree.
  • 5+ years of experience in implementing MDM solutions using Informatica MDM.
  • 2+ years of experience in data stewardship, data governance, and data management concepts.
  • Professional working knowledge of Customer 360 solution
  • Professional working knowledge in multi domain MDM data modeling.
  • Strong understanding of company master data sets and their application in complex business processes and support data profiling, extraction, and cleansing activities using Informatica Data Quality (IDQ).
  • Strong knowledge in the installation and configuration of the Informatica MDM Hub.
  • Familiarity with real-time, near real-time and batch data integration.
  • Strong experience and understanding of Informatica toolsets including Informatica MDM Hub, Informatica Data Quality (IDQ), Informatica Customer 360, Informatica EDC, Hierarchy Manager (HM), Business Entity Service Model, Address Doctor, Customizations & Composite Services
  • Experience with event-driven architectures (e.g. Kafka, Google Pub/Sub, Azure Event Hub, etc.).
  • Professional working knowledge of CI/CD technologies such as Concourse, TeamCity, Octopus, Jenkins, and CircleCI.
  • Team player that exhibits high energy, strategic thinking, collaboration, direct communication and results orientation.

Physical Requirements

  • Visual requirements include: ability to see detail at near range with or without correction. Must be physically able to perform sedentary work: occasionally lifting or carrying objects of no more than 10 pounds, and occasionally standing or walking, reaching, handling, grasping, feeling, talking, hearing and repetitive motions.

Working Conditions

  • The duties of this position are performed through a combination of an open office setting and remote work options. Full remote work options available for employees that reside outside of the Des Moines Metro Area. There is frequent pressure to meet deadlines and handle multiple projects in a day.

Equipment Used to Perform Job

  • Windows, or Mac computer and various software solutions.

Financial Responsibility

  • Responsible for company assets including maintenance of software solutions.

Contacts

  • Has frequent contact with office personnel in other departments related to the position, as well as occasional contact with users and customers. Engages stakeholders from other areas in the business.

Confidentiality

  • Has access to confidential information including trade secrets, intellectual property, various financials, and customer data.
Encubate Tech Private Ltd

Agency job via staff hire solutions by Purvaja Patidar
Mumbai
5 - 6 yrs
₹15L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
Data modeling
ETL
Agile/Scrum
+7 more

Roles and Responsibilities

We are seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate on-premise traditional workloads to the cloud. The candidate must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques.

• Extensive experience in providing AWS Cloud solutions for various business use cases.
• Creating star schema data models, performing ETLs and validating results with business representatives (a brief sketch follows this list).
• Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance, and communicating functional and technical issues.
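For illustration of the star schema point above, a minimal sketch run here against SQLite; a Redshift design would add distribution and sort keys, and the table names are hypothetical:

    # One fact table and two dimensions: the smallest possible star schema.
    import sqlite3

    ddl = """
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        amount REAL
    );
    """
    conn = sqlite3.connect(":memory:")
    conn.executescript(ddl)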

Job Description:

This position is responsible for the successful delivery of business intelligence information to the entire organization and is experienced in BI development and implementations, data architecture and data warehousing.

Requisite Qualification

Essential: AWS Certified Database Specialty or AWS Certified Data Analytics

Preferred: Any other data engineering certification

Requisite Experience

Essential: 4-7 yrs of experience

Preferred: 2+ yrs of experience in ETL & data pipelines

Skills Required

• AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.
• Big data: Databricks, Spark, Glue and Athena
• Expertise in Lake Formation, Python programming, Spark, shell scripting
• Minimum Bachelor's degree with 5+ years of experience in designing, building, and maintaining AWS data components
• 3+ years of experience in data component configuration, related roles and access setup
• Expertise in Python programming
• Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.)
• Comfortable working with DevOps tooling: Jenkins, Bitbucket, CI/CD
• Hands-on ETL development experience, preferably using SSIS
• SQL Server experience required
• Strong analytical skills to solve and model complex business requirements
• Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques

Preferred Skills

• Experience working in a SCRUM environment.
• Experience in administration (Windows/Unix/Network/Database/Hadoop) is a plus.
• Experience in SQL Server, SSIS, SSAS, SSRS
• Comfortable with creating data models and visualizations using Power BI
• Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools
• Ability to collaborate on a team with infrastructure, BI report development and business analyst resources, and clearly communicate solutions to both technical and non-technical team members

Thoughtworks

Posted by Vidyashree Kulkarni
Remote only
9 - 15 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.

Job responsibilities
  • You will partner with teammates to create complex data processing pipelines in order to solve our clients' most complex challenges
  • You will collaborate with Data Scientists in order to design scalable implementations of their models
  • You will pair to write clean and iterative code based on TDD
  • Leverage various continuous delivery practices to deploy, support and operate data pipelines
  • Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
  • Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
  • Create data models and speak to the tradeoffs of different modeling approaches
  • Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process
  • Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes
Job qualifications

Technical skills

  • You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop (a minimal Spark sketch follows this list)
  • You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (Hbase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting
  • Hands on experience in MapR, Cloudera, Hortonworks and/or cloud (AWS EMR, Azure HDInsights, Qubole etc.) based Hadoop distributions
  • You are comfortable taking data-driven approaches and applying data security strategy to solve business problems
  • Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
  • You're genuinely excited about data infrastructure and operations with a familiarity working in cloud environments
Professional skills
  • You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives
  • An interest in coaching, sharing your experience and knowledge with teammates
  • You enjoy influencing others and always advocate for technical excellence while being open to change when needed
  • Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more
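For a concrete flavour of the pipeline work above, a minimal PySpark sketch; the S3 paths and columns are hypothetical and a configured Spark environment is assumed:

    # Minimal batch pipeline: read raw orders, aggregate, write a mart.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_revenue").getOrCreate()
    orders = spark.read.csv("s3://bucket/orders/", header=True, inferSchema=True)
    daily = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
    daily.write.mode("overwrite").parquet("s3://bucket/marts/daily_revenue/")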
xoxoday

Agency job via Jobdost by Mamatha A
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹15L / yr
MySQL
MongoDB
Data modeling
API
Apache Kafka
+2 more

What is the role?

You will be responsible for building and maintaining highly scalable data infrastructure for our cloud-hosted SAAS product. You will work closely with the Product Managers and Technical team to define and implement data pipelines for customer-facing and internal reports.

Key Responsibilities

  • Design and develop resilient data pipelines (a small ingestion sketch follows this list).
  • Write efficient queries to fetch data from the report database.
  • Work closely with application backend engineers on data requirements for their stories.
  • Designing and developing report APIs for the front end to consume.
  • Focus on building highly available, fault-tolerant report systems.
  • Constantly improve the architecture of the application by clearing the technical backlog. 
  • Adopt a culture of learning and development to constantly keep pace with and adopt new technologies.
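As a small, hedged sketch of the ingestion side of such a pipeline (using the kafka-python client, assumed available; topic, group, and broker addresses are placeholders):

    # Consume report events and commit offsets only after a successful write,
    # so a crash replays rather than loses data. Illustrative only.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "report-events",
        bootstrap_servers=["localhost:9092"],
        group_id="report-pipeline",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        enable_auto_commit=False,
    )
    for message in consumer:
        row = message.value
        # ... upsert `row` into the report database here ...
        consumer.commit()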

What are we looking for?

An enthusiastic individual with the following skills. Please do not hesitate to apply even if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.

  • Education - BE/MCA or equivalent
  • Overall 8+ years of experience
  • Expert level understanding of database concepts and BI.
  • Well versed in databases such as MySQL, MongoDB, with hands-on experience in creating data models.
  • Must have designed and implemented low-latency data warehouse systems.
  • Must have a strong understanding of Kafka and related systems.
  • Experience with the ClickHouse database preferred.
  • Must have good knowledge of APIs and should be able to build interfaces for frontend engineers.
  • Should be innovative and communicative in approach
  • Will be responsible for functional/technical track of a project

Whom will you work with?

You will work with a top-notch tech team, working closely with the CTO and product team.  

What can you look for?

A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts, yet maintain the quality of content, interact, and share your ideas and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.

We are

Xoxoday is a rapidly growing fintech SaaS firm that propels business growth while focusing on human motivation. Backed by Giift and Apis Partners Growth Fund II, Xoxoday offers a suite of three products - Plum, Empuls, and Compass. Xoxoday works with more than 2000 clients across 10+ countries and over 2.5 million users. Headquartered in Bengaluru, Xoxoday is a 300+ strong team with four global offices in San Francisco, Dublin, Singapore, New Delhi.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

 

xpressbees

Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees – a logistics company started in 2015 – is amongst the fastest growing
companies of its sector. While we started off rather humbly in the space of
ecommerce B2C logistics, the last 5 years have seen us steadily progress towards
expanding our presence. Our vision to evolve into a strong full-service logistics
organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross
border operations. Our strong domain expertise and constant focus on meaningful
innovation have helped us rapidly evolve as the most trusted logistics partner of
India. We have progressively carved our way towards best-in-class technology
platforms, an extensive network reach, and a seamless last mile management
system. While on this aggressive growth path, we seek to become the one-stop-shop
for end-to-end logistics solutions. Our big focus areas for the very near future
include strengthening our presence as service providers of choice and leveraging the
power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform
and infrastructure to support high quality and agile decision-making in our supply chain and logistics
workflows.
You will define the way we collect and operationalize data (structured / unstructured), and
build production pipelines for our machine learning models, and (RT, NRT, Batch) reporting &
dashboarding requirements. As a Senior Data Engineer in the XB Data Platform Team, you will use
your experience with modern cloud and data frameworks to build products (with storage and serving
systems)
that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making,
insights, anomaly detection and prediction.
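To make "anomaly detection" concrete, here is a toy z-score check on a daily metric; the series below is synthetic:

    # Flag days whose metric deviates more than two standard deviations.
    import pandas as pd

    series = pd.Series([100, 102, 98, 101, 180, 99])  # made-up daily shipment counts
    z = (series - series.mean()) / series.std()
    print(series[z.abs() > 2])  # flags the spike at index 4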

What You Will Do
• Design and develop data platform and data pipelines for reporting, dashboarding and
machine learning models. These pipelines would productionize machine learning models
and integrate with agent review tools.
• Meet the data completeness, correctness and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support
business needs. Come up with logical and physical database design across platforms (MPP,
MR, Hive/PIG) which are optimal physical designs for different use cases (structured/semi
structured). Envision & implement the optimal data modelling, physical design,
performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines and envision and build their
successors.

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or related field with 6 to 9 years of technology
experience.
• Knowledge of Relational and NoSQL data stores, stream processing and micro-batching to
make technology & design choices.
• Strong experience in System Integration, Application Development, ETL, Data-Platform
projects. Talented across technologies used in the enterprise space.
• Software development experience using:
• Expertise in relational and dimensional modelling
• Exposure across all the SDLC process
• Experience in cloud architecture (AWS)
• Proven track record in keeping existing technical skills and developing new ones, so that
you can make strong contributions to deep architecture discussions around systems and
applications in the cloud ( AWS).

• Characteristics of a forward thinker and self-starter that flourishes with new challenges
and adapts quickly to learning new knowledge
• Ability to work with a cross functional teams of consulting professionals across multiple
projects.
• Knack for helping an organization to understand application architectures and integration
approaches, to architect advanced cloud-based solutions, and to help launch the build-out
of those systems
• Passion for educating, training, designing, and building end-to-end systems.
Wellness Forever Medicare Private Limited
Mumbai
3 - 5 yrs
₹7L - ₹11L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL server
Microsoft Windows Azure
+4 more
  • Minimum 3-4 years of experience with ETL tools, SQL, SSAS & SSIS
  • Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
  • Knowledge of programming languages, e.g. JSON, Python, R
  • Hands on experience of SQL database design
  • Experience working with REST API
  • Influencing and supporting project delivery through involvement in project/sprint planning and QA
  • Working experience with Azure
  • Stakeholder management
  • Good communication skills
Crayon Data

Posted by Varnisha Sethupathi
Chennai
5 - 8 yrs
₹15L - ₹25L / yr
SQL
Python
Analytical Skills
Data modeling
Data Visualization
+1 more

Role : Senior Customer Scientist 

Experience : 6-8 Years 

Location : Chennai (Hybrid) 
 
 

Who are we? 
 
 

A young, fast-growing AI and big data company, with an ambitious vision to simplify the world's choices. Our clients are top-tier enterprises in the banking, e-commerce and travel spaces. They use our core AI-based choice engine maya.ai to deliver personal digital experiences centered around taste. The maya.ai platform now touches over 125M customers globally. You'll find Crayon Boxes in Chennai and Singapore. But you'll find Crayons in every corner of the world. Especially where our client projects are – UAE, India, SE Asia and pretty soon the US.
 
 

Life in the Crayon Box is a little chaotic, largely dynamic and keeps us on our toes! Crayons are a diverse and passionate bunch. Challenges excite us. Our mission drives us. And good food, caffeine (for the most part) and youthful energy fuel us. Over the last year alone, Crayon has seen a growth rate of 3x, and we believe this is just the start. 
 

 
We’re looking for young and young-at-heart professionals with a relentless drive to help Crayon double its growth. Leaders, doers, innovators, dreamers, implementers and eccentric visionaries, we have a place for you all. 
 

 
 

Can you say “Yes, I have!” to the below? 
 
 

  1. Experience with exploratory analysis, statistical analysis, and model development

  2. Knowledge of advanced analytics techniques, including predictive modelling (logistic regression), segmentation, forecasting, data mining, and optimization (a brief sketch follows this list)

  3. Knowledge of software packages such as SAS, R, and RapidMiner for analytical modelling and data management

  4. Strong experience in SQL / Python / R, working efficiently at scale with large data sets

  5. Experience in using Business Intelligence tools such as PowerBI, Tableau, and Metabase for business applications
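Purely by way of illustration of the predictive modelling point, a quick logistic regression sketch with scikit-learn on synthetic data:

    # Toy churn-style classifier; features and labels are simulated.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))  # e.g. spend, recency, tenure (hypothetical)
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression().fit(X_train, y_train)
    print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")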
     

 
 

 

Can you say “Yes, I will!” to the below? 
 
 

  1. Drive clarity and solve ambiguous, challenging business problems using data-driven approaches. Propose and own data analysis (including modelling, coding, analytics) to drive business insight and facilitate decisions.

  2. Develop creative solutions and build prototypes to business problems using algorithms based on machine learning, statistics, and optimisation, and work with engineering to deploy those algorithms and create impact in production.

  3. Perform time-series analyses, hypothesis testing, and causal analyses to statistically assess the relative impact and extract trends.

  4. Coordinate individual teams to fulfil client requirements and manage deliverables.

  5. Communicate and present complex concepts to business audiences.

  6. Travel to client locations when necessary.

 

 

Crayon is an equal opportunity employer. Employment is based on a person's merit, qualifications and professional competence. Crayon does not discriminate against any employee or applicant because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, marital status, pregnancy or related conditions.
 
 

More about Crayon: https://www.crayondata.com/
 

More about maya.ai: https://maya.ai/

 

 

Wissen Technology

Posted by Lokesh Manikappa
Bengaluru (Bangalore)
5 - 12 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data modeling
Spark
+5 more

Job Description

The applicant must have a minimum of 5 years of hands-on IT experience, working on a full software lifecycle in Agile mode.

Good to have experience in data modeling and/or systems architecture.
Responsibilities will include technical analysis, design, development and performing enhancements.

You will participate in all/most of the following activities:
- Working with business analysts and other project leads to understand requirements.
- Modeling and implementing database schemas in DB2 UDB or other relational databases.
- Designing, developing, maintaining and Data processing using Python, DB2, Greenplum, Autosys and other technologies

 

Skills/Expertise Required:

Work experience in developing large-volume databases (DB2/Greenplum/Oracle/Sybase).

Good experience in writing stored procedures, integration of database processing, tuning and optimizing database queries.

Strong knowledge of table partitions, high-performance loading and data processing.
Good to have hands-on experience working with Perl or Python.
Hands-on development using the Spark / KDB / Greenplum platforms will be a strong plus.
Designing, developing, maintaining and supporting Data Extract, Transform and Load (ETL) software using Informatica, Shell Scripts, DB2 UDB and Autosys.
Coming up with system architecture/re-design proposals for greater efficiency and ease of maintenance and developing software to turn proposals into implementations.

Need to work with business analysts and other project leads to understand requirements.
Strong collaboration and communication skills

A fast-growing SaaS commerce company (permanent WFH & Office)

Agency job via Jobdost by Mamatha A
Remote only
10 - 15 yrs
₹40L - ₹60L / yr
Engineering Management
NodeJS (Node.js)
React Native
GraphQL
Elastic Search
+9 more

Job Description

 

What is the role?

You will be expected to manage the product plan, engineering, and delivery of Plum Integration activities. Plum is a rewards and incentives infrastructure for businesses. It's a unified, integrated suite of products to handle various rewarding use cases for consumers, sales, channel partners, and employees. 31% of the total tech team is aligned towards this product and comprises 32 members across Plum Tech, Quality, Design, and Product management. The annual FY 2019-20 revenue for Plum was $40MN and it is showing high growth potential this year as well. The product has a good mix of both domestic and international clientele and is expanding. The role will be based out of our head office in Bangalore, Karnataka; however, we are open to discussing the option of remote working with 25-50% travel.

Key Responsibilities

  • Scope and lead technology with the right product and business metrics.
  • Directly contribute to product development by writing code if required.
  • Architect systems for scale and stability.
  • Serve as a role model for our high engineering standards and bring consistency to the many codebases and processes you will encounter.
  • Collaborate with stakeholders across disciplines like sales, customers, product, design, and customer success.
  • Code reviews and feedback.
  • Build simple solutions and designs over complex ones and have a good intuition for what is lasting and scalable.
  • Define a process for maintaining a healthy engineering culture (Cadence for one-on-ones, meeting structures, HLDs, Best Practices In development, etc.).

What are we looking for?

  • Manage a senior tech team of more than 5 direct and 10 indirect developers.
  • Should have experience in handling e-commerce applications at scale.
  • Should have experience working with applications like HubSpot, Salesforce, and other CRMs.
  • Should have experience in B2B integrations.
  • Should have at least 10+ years of experience in software development and agile processes for international e-commerce businesses.
  • Should be an extremely hands-on full-stack developer with an "automate as much as possible" mindset.
  • Should exhibit the skills to build a good engineering team and culture.
  • Should be able to handle the chaos of product planning and prioritization with a customer-first approach.
  • Technical proficiency:
  • Frameworks like React, React Native, Node.js, GraphQL
  • Database technologies like Elasticsearch, Redis, MySQL, MongoDB, Kafka
  • DevOps to manage and architect infra: AWS, CI/CD (Jenkins)
  • System architecture w.r.t. microservices, cloud development, DB administration, and data modeling
  • Understanding of security principles and possible attacks, and how to mitigate them.

Whom will you work with?

You will lead the Plum Integration Engineering team and work in close conjunction with the tech leads of Plum, with some cross-functional stake in other products. You will report directly to the CTO.

What can you look for?

A wholesome opportunity in a fast-paced environment with scale, international flavor, and both backend and frontend work. Work with a team of highly talented young professionals and enjoy the benefits.
EASEBUZZ

at EASEBUZZ

1 recruiter
Amala Baby
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions (fintech) company which enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have been able to grow 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture.

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with 3rd-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue (see the sketch below).
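
As an illustration only (not from the posting), a minimal PySpark batch job of the kind such pipelines are built from; the bucket paths and column names are hypothetical, and on EMR or Glue the S3 connector is already configured.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-payments-etl").getOrCreate()

# Read raw JSON events from S3 (hypothetical bucket and layout).
raw = spark.read.json("s3://example-raw-bucket/payments/2021-03-01/")

# Aggregate successful transactions per merchant per day.
daily_totals = (
    raw.filter(F.col("status") == "SUCCESS")
       .groupBy("merchant_id", F.to_date("created_at").alias("txn_date"))
       .agg(
           F.count("*").alias("txn_count"),
           F.sum("amount").alias("gross_amount"),
       )
)

# Write partitioned Parquet back to S3 for downstream consumers.
(daily_totals.write
    .mode("overwrite")
    .partitionBy("txn_date")
    .parquet("s3://example-curated-bucket/payments/daily_totals/"))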

Experience with the AWS cloud data lake for development of real-time or near-real-time use cases.

Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing (a minimal consumer sketch follows).
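
For illustration only, a minimal real-time ingestion loop using the kafka-python client; the topic, brokers, and message schema are hypothetical.

import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payment-events",                      # hypothetical topic
    bootstrap_servers=["localhost:9092"],  # hypothetical brokers
    group_id="ingestion-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real pipeline would land this in a staging store or stream processor;
    # here the parsed event is just printed.
    print(event.get("merchant_id"), event.get("amount"))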

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB, etc.

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into the engineering and science workflows.

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.

 

Employment Type

Full-time

 

A Pre-series A funded FinTech Company

A Pre-series A funded FinTech Company

Agency job
via GoHyre by Avik Majumder
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own data integrity across distributed systems.
  • Extract, transform, and load data from multiple systems into the BI platform for reporting.
  • Create data sets and data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the Project Manager to deliver on business requirements OTIF (on time, in full).
  • Understand the cross-functional business data points thoroughly and be the SPOC for all data-related queries.
  • Work with both web analytics and backend data analytics.
  • Support the rest of the BI team in generating reports and analysis.
  • Quickly learn and use bespoke & third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for Leadership.

 Requirements:

  • Strong experience in data warehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports, and dashboards based on business requirements
  • Knowledge and experience in prototyping, designing, and requirement analysis
  • Ability to implement row-level security on data and understand application security layer models in Power BI
  • Proficiency in writing DAX queries in Power BI Desktop
  • Expertise in using advanced-level calculations on data sets
  • Experience in the Fintech domain and stakeholder management
Digital B2B Platform

Digital B2B Platform

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
6 - 12 yrs
₹60L - ₹80L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Problem solving
+3 more

Looking only for candidates from tier 1 colleges OR with experience in a product-based company.

 

Desired Skills :

● Experience with data modeling and SQL/NoSQL databases

● Experience with distributed systems and microservices

● Good experience in working with any of Java/SpringBoot, GoLang or NodeJS

● Excellent problem solving and debugging skills

● Passionate about the experience of software engineering as much as the output

● A strong sense of ownership

● Ability to communicate your ideas and approach to solving problems with clarity

Digital B2B Platform

Digital B2B Platform

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
3 - 5 yrs
₹30L - ₹45L / yr
Data modeling
Java
Spring
Microservices
SQL

Looking only for candidates from tier 1 colleges OR with experience in a product-based company.


Desired Skills :

● Experience with data modeling and SQL/NoSQL databases

● Experience with distributed systems and microservices

● Good experience in working with any of Java/SpringBoot, GoLang or NodeJS

● Excellent problem solving and debugging skills

● Passionate about the experience of software engineering as much as the output

● A strong sense of ownership

● Ability to communicate your ideas and approach to solving problems with clarity

Silverline India

at Silverline India

3 recruiters
Prasad Dubbewar
Posted by Prasad Dubbewar
Remote only
3 - 5 yrs
₹14L - ₹15L / yr
SOQL
Salesforce
Salesforce Lightning
Salesforce Apex
SOSL
+1 more
COMPANY OVERVIEW
Do you want to be part of a fast-paced environment, supporting the growth of cutting-edge technology in cloud computing?
Silverline employees are passionate about what they do. We're a diverse group of cloud technology professionals with the same goal: to make our clients' lives easier through technology, and we have a good time while we're at it!
As a Salesforce Platinum Cloud Alliance Partner, Silverline combines Strategic Advisory, technical implementation, and ongoing Managed Services to enable organizations to achieve maximum value with the Salesforce platform. Silverline is the #1 Company to Work For among small and medium-sized companies in this year's annual Glassdoor Employees' Choice Awards, a list of the Best Places to Work in 2018. Come be a part of our team!

Job Summary
A Senior Developer is responsible for writing clean and flawless code to produce fully functional modules or apps according to requirements as part of day-to-day work. A Senior Developer is expected to possess expert-level knowledge of the Force.com platform and an analytical mindset with a keen eye for detail.

A Senior Developer needs to conduct following responsibilities at work -
1. Billability of at least 40 hrs per week (Util Target). Resource needs to ensure that assigned hours
are utilized on each project.
2. Perform thorough analysis on requirements i.e.
a. Obtain a deep understanding of the desired output
b. Check whether the implementation is possible or not; respecting the Salesforce
environment’s limitations or governors
c. Evaluate whether stated requirement (or a part) can be achieved via configuration before
opting for coding
A successful analysis results in -
● Derivation of all possible use cases
● Identification of any blockers, challenges or risks
● An optimal code design solution of complex requirements
● Thorough impact analysis of the planned development
A Senior developer must discuss the approach with TL/TM or an onshore TA and finalize the ETAs.
3. Develop high quality scalable code by
a. Following Salesforce.com and Silverline-crm standards/best practices
b. Leverage recommended frameworks (ex. Test framework or Trigger framework)
c. Write comprehensive test code asserting all possible use cases
d. Perform thorough unit tests
e. Fix all bugs reported from QA/UAT
f. Work on deployments through recommended tools and methodologies.
g. Time-bound R&D for any unknown or newer areas
h. Developing reusable code components in a project
i. Mentor, assist and review peer developers within the teams
j. Document CDDDs
4. Follow devops processes i.e.
a. Use recommended IDEs
b. Ensure daily code check-in/check-out
c. Stay in compliance with Clayton and code review feedback points shared by TL/TM.
5. Facilitate proactive communication with project/product teams i.e.
a. Use all relevant channels (Emails, JIRA, Slack or Zoom meetings)
b. Be in sync with sprint/project/product teams
c. Share your plan of action during the start of your day
d. Send DSRs by end of the day
e. Ensure decent overlap/availability in onshore team’s timezone (whenever required).
6. Focus on learning & development by
a. Setting OKRs for the quarter in alignment with the Company goals.
b. Constantly perform self review and identify improvement areas
c. Seek timely assistance from Reporting Managers
d. Actively contribute on knowledge sharing initiatives and help in skill development of peer
developers.
e. Enhance skills and capability in USABILITY-focused development
7. Exhibit good interpersonal skills by being a team player, staying approachable and collaborative.
8. Obtain and maintain certifications related to Salesforce, nCino, vlocity, Mulesoft, Marketing
Cloud.

Required Experience and Qualifications
● Required - Minimum 4 - 7 years experience in Software Development
● Required - Minimum 3 years experience with SFDC
● Required - Good in communication and interpersonal skills
● Required - Certified Salesforce - Platform Developer I
● Required - Minimum Bachelor's Degree in Engineering or Science

Preferred Qualifications
● Certified nCino Advanced Commercial/Retail Developer
● Certified vlocity Developer
● Certified Mulesoft Developer Certified - Level 1
● Certified Marketing Cloud Email Specialist/Developer
● Certified Salesforce - Platform Developer II

Benefits
Industry-leading benefits, including health insurance, are offered. You will be part of a US-based/headquartered company.

Job Type:
Full-time (flexible with remote working if outside Bangalore)
Leading Manufacturing Company

Leading Manufacturing Company

Agency job
Chennai
3 - 6 yrs
₹3L - ₹8L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Data modeling
Data Analytics
+2 more

Location:  Chennai
Education: BE/BTech
Experience: Minimum 3+ years of experience as a Data Scientist/Data Engineer

Domain knowledge: Data cleaning, modelling, analytics, statistics, machine learning, AI

Requirements:

  • To be part of Digital Manufacturing and Industrie 4.0 projects across the client's group of companies
  • Design and develop AI/ML models to be deployed across factories
  • Knowledge of Hadoop, Apache Spark, MapReduce, Scala, Python programming, and SQL and NoSQL databases is required
  • Should be strong in statistics, data analysis, data modelling, machine learning techniques, and neural networks
  • Prior experience in developing AI and ML models is required
  • Experience with data from the manufacturing industry would be a plus

Roles and Responsibilities:

  • Develop AI and ML models for the manufacturing industry with a focus on energy, asset performance optimization, and logistics (a small illustrative sketch follows this list)
  • Multitasking and good communication are necessary
  • An entrepreneurial attitude
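
For illustration only (not from the posting), a small sketch of the kind of model such a role might build: predicting a plant's energy consumption from sensor features with scikit-learn. The features and data are synthetic.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1_000

# Hypothetical features: machine load (%), ambient temperature (C), line speed.
X = np.column_stack([
    rng.uniform(20, 100, n),   # load_pct
    rng.uniform(15, 40, n),    # ambient_temp_c
    rng.uniform(0.5, 2.0, n),  # line_speed
])
# Synthetic target: energy (kWh) as a noisy function of the features.
y = 5.0 * X[:, 0] + 2.0 * X[:, 1] + 30.0 * X[:, 2] + rng.normal(0, 10, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("MAE (kWh):", mean_absolute_error(y_test, model.predict(X_test)))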

Additional Information:

  • Travel: Must be willing to travel for shorter durations within India and abroad
  • Job Location: Chennai
  • Reporting to: Team Leader, Energy Management System
Information Technology Services

Information Technology Services

Agency job
via Jobdost by Sathish Kumar
Pune
5 - 8 yrs
₹11L - ₹30L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
Javascript
+14 more

 Sr. Java Software Engineer:

Preferred Education & Experience:

  • Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, or a related technical field. Relevant experience of at least 3 years in lieu of the above if from a different stream of education.
  • Well-versed in, and 5+ years of hands-on design experience with, Object Oriented Design, Data Modeling, Class & Object Modeling, and Microservices Architecture & Design.
  • Well-versed in, and 5+ years of hands-on programming experience with, Core Java, Advanced Java, the Spring Framework, Spring Boot or the Micronaut Framework, log frameworks, build & deployment frameworks, etc.
  • 3+ years of hands-on experience developing Domain-Driven Microservices using libraries & frameworks such as Micronaut, Spring Boot, etc.
  • 3+ years of hands-on experience developing connector frameworks such as Apache Camel, the Akka framework, etc.
  • 3+ years of hands-on experience with RDBMS & NoSQL database concepts and development practices (PostgreSQL, MongoDB, Elasticsearch, Amazon S3).
  • 3+ years of hands-on experience developing web services using REST and API Gateway with token-based authentication and access management.
  • 1+ years of hands-on experience developing and hosting microservices using serverless and container-based development (AWS Lambda, Docker, Kubernetes, etc.).
  • Knowledge of and hands-on experience developing applications using Behavior-Driven Development and Test-Driven Development methodologies is a plus.
  • Knowledge of and hands-on experience with AWS Cloud Services such as IAM, Lambda, EC2, ECS, ECR, API Gateway, S3, SQS, Kinesis, CloudWatch, DynamoDB, etc. is also a plus.
  • Knowledge of and hands-on experience with DevOps CI/CD tools such as JIRA, Git (Bitbucket/GitHub), Artifactory, etc., and build tools such as Maven & Gradle.
  • 2+ years of hands-on development experience in Java-centric developer tools, management & governance, networking and content delivery, security, identity, and compliance, etc.
  • Knowledge of and hands-on experience with Apache Nifi, Apache Spark, and Apache Flink is also a plus.
  • Knowledge of and hands-on experience with Python, NodeJS, and Scala programming is also a plus.

Required Experience: 5+ Years

Job Location: Remote / Pune

Open Positions: 1

Elastic

at Elastic

5 recruiters
Nikita Rathi
Posted by Nikita Rathi
Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹30L / yr
Payment gateways
Payment processing
Compensation & Benefits
Compensation
HR analytics
+9 more
Elastic is a search company built on a free and open heritage. Anyone can use Elastic products and solutions to get started quickly and frictionlessly. Elastic offers three solutions for enterprise search, observability, and security, built on one technology stack that can be deployed anywhere. From finding documents to monitoring infrastructure to hunting for threats, Elastic makes data usable in real time and at scale. Thousands of organizations worldwide, including Cisco, eBay, Goldman Sachs, Microsoft, The Mayo Clinic, NASA, The New York Times, Wikipedia, and Verizon, use Elastic to power mission-critical systems. Founded in 2012, Elastic is a distributed company with Elasticians around the globe and is publicly traded on the NYSE under the symbol ESTC. Learn more at elastic.co.

We're looking for a Compensation Advisor to help drive our compensation philosophy and programs to support our overall people strategies.

What You Will Be Doing:
  • Support our Corporate functions with all elements of general compensation including job evaluation & classification within the Elastic job infrastructure
  • Create and present comprehensive analysis to assess compensation issues in the business and provide market competitive and business aligned solutions using internal and external benchmark data sources
  • Partner and provide thought leadership to HR Business Partners, Recruiting, and line management on the administration of all Elastic compensation programs and policies
  • Assist in the planning and management of annual compensation processes and programs (e.g. Annual Compensation Cycle)
  • Design and build compensation metrics, reports, and tools to inform compensation program decisions and to forecast, report, and/or analyze compensation business outcomes (a tiny illustrative metric sketch follows this list)
  • Partner with other Global Compensation team members and broader HR team in the design, development and maintenance of various compensation policies and programs
  • Evangelize Elastic’s Compensation and Total Rewards Strategy
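
For illustration only (not from the posting), a tiny Python sketch of one standard compensation metric, the compa-ratio: an employee's salary divided by the midpoint of their pay band. The band names and figures are made up.

# Hypothetical pay bands.
BANDS = {
    "IC3": {"min": 90_000, "max": 130_000},
    "IC4": {"min": 120_000, "max": 170_000},
}

def compa_ratio(salary: float, band: str) -> float:
    # Salary as a ratio of the band midpoint (1.0 = exactly at midpoint).
    b = BANDS[band]
    midpoint = (b["min"] + b["max"]) / 2
    return salary / midpoint

# An employee paid 104,500 in band IC3 sits 5% below the 110,000 midpoint.
print(round(compa_ratio(104_500, "IC3"), 2))  # -> 0.95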
What You Bring Along:
  • A real passion for fast-paced and dynamic environments, the ability to thrive in ambiguity and devote a diversified global view to all you do
  • Experience with compensation design and administration
  • A dedication to think big, use data to drive strategy, challenge convention, and potentially reinvent how work is done
  • A proficiency in optimizing compensation processes and programs with a keen understanding of the balance between structure and flexibility
  • The dexterity to balance critical, long term thinking about how we scale and optimize our rapidly growing employee base in tandem with the development and execution of global people strategies
And some of the basics:
  • Bachelor's Degree and 5+ years of related experience; or 3 years with a Master’s degree
  • 3+ years of experience in compensation analysis, financial analysis, statistical analysis, and/or data modeling is required
  • Demonstrable mastery with Excel and Google suite
  • Experience working with HR Business Partners or similar in a business facing role to solve compensation challenges.
  • Experience working in a hyper-growth, global organization and exposure to compensation programs at scale
  • Strategic, analytical, critical thinker who isn’t afraid to get into the details; real passion for problem solving
  • Excellent written and verbal communication and presentation skills
  • High degree of integrity and honesty; ability to exercise confidentiality and neutrality in complex and sensitive situations
  • A sense of humor and the ability to roll with the punches are definitely a plus.


Additional Information - We Take Care Of Our People

As a distributed company, diversity drives our identity. Whether you’re looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life. Your age is only a number. It doesn’t matter if you’re just out of college or your children are; we need you for what you can do.

We strive to have parity of benefits across regions and while regulations differ from place to place, we believe taking care of our people is the right thing to do.
  • Competitive pay based on the work you do here and not your previous salary
  • Health coverage for you and your family in many locations
  • Ability to craft your calendar with flexible locations and schedules for many roles
  • Generous number of vacation days each year
  • Double your charitable giving - We match up to $1500 (or local currency equivalent)
  • Up to 40 hours each year to use toward volunteer projects you love
  • Embracing parenthood with minimum of 16 weeks of parental leave
 
 
A global provider of Business Process Management company

A global provider of Business Process Management company

Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore), UK
5 - 10 yrs
₹15L - ₹25L / yr
Data Visualization
PowerBI
ADF
Business Intelligence (BI)
PySpark
+11 more

Power BI Developer

We are looking for a senior visualization engineer with 5 years' experience in Power BI to develop and deliver solutions that enable the delivery of information to audiences in support of key business processes. In addition, hands-on experience with Azure data services like ADF and Databricks is a must.

Ensure code and design quality through execution of test plans, and assist in the development of standards & guidelines, working closely with internal and external design, business, and technical counterparts.

Candidates should have worked in agile development environments.

Desired Competencies:

  • Should have a minimum of 3 years of project experience using Power BI on the Azure stack.
  • Should have a good understanding and working knowledge of data warehousing and data modelling.
  • Good hands-on experience with Power BI
  • Hands-on experience with T-SQL/DAX/MDX/SSIS
  • Data warehousing on SQL Server (preferably 2016)
  • Experience in Azure Data Services: ADF, Databricks & PySpark (see the sketch after this list)
  • Manage own workload with minimum supervision.
  • Take responsibility for projects or issues assigned to them
  • Be personable, flexible, and a team player
  • Good written and verbal communication
  • Have a strong personality and be able to operate directly with users
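
For illustration only (not part of the job description), a minimal PySpark transformation of the kind used on Databricks to prepare a dataset for Power BI; the table and column names are hypothetical, and in a Databricks notebook the spark session already exists.

from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("powerbi-dataset-prep").getOrCreate()

orders = spark.table("sales.orders")  # hypothetical table

# Monthly spend per customer.
monthly = (
    orders.groupBy("customer_id",
                   F.date_trunc("month", "order_ts").alias("month"))
          .agg(F.sum("amount").alias("monthly_spend"))
)

# Keep each customer's single highest-spend month.
w = Window.partitionBy("customer_id").orderBy(F.desc("monthly_spend"))
top_month = (monthly.withColumn("rank", F.row_number().over(w))
                    .filter("rank = 1"))

# Persist for BI consumption, e.g. as a source for a Power BI dataset.
top_month.write.mode("overwrite").saveAsTable("analytics.customer_top_month")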