Data modeling jobs

50+ Data modeling Jobs in India

Apply to 50+ Data modeling Jobs on CutShort.io. Find your next job, effortlessly. Browse Data modeling Jobs and apply today!

Cymetrix Software

Posted by Netra Shettigar
Chennai, Bengaluru (Bangalore)
5 - 10 yrs
₹15L - ₹26L / yr
Google Cloud Platform (GCP)
BigQuery
Data modeling
Snowflake schema
OLTP

1. GCP: GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, BQ optimization, Airflow/Composer, Python (preferred)/Java

2. ETL on GCP Cloud: building pipelines (Python/Java) plus scripting, best practices, and common challenges

3. Knowledge of batch and streaming data ingestion; building end-to-end data pipelines on GCP (see the sketch after this list)

4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud, SQL vs. NoSQL, and types of NoSQL databases (at least 2)

5. Data warehouse concepts: beginner to intermediate level

6. Data modeling, GCP databases, DBSchema (or similar)

7. Hands-on data modelling for OLTP and OLAP systems

8. In-depth knowledge of conceptual, logical, and physical data modelling

9. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same

10. Strong understanding of the variables impacting database performance for near-real-time reporting and application interaction

11. Working experience with at least one data modelling tool, preferably DBSchema or Erwin

12. Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery

13. Functional knowledge of the mutual fund industry is a plus

14. Willingness to work from Chennai; office presence is mandatory

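To make point 3 concrete, here is a minimal sketch of the kind of Composer/Airflow pipeline this stack implies. It is an illustration only: the bucket, dataset, and table names are hypothetical, and it assumes the apache-airflow-providers-google package.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="gcs_to_bq_daily",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Land the day's files from GCS into a BigQuery staging table.
    load_orders = GCSToBigQueryOperator(
        task_id="load_orders",
        bucket="example-landing-bucket",                 # hypothetical bucket
        source_objects=["orders/{{ ds }}/*.json"],       # one folder per run date
        source_format="NEWLINE_DELIMITED_JSON",
        destination_project_dataset_table="example-project.staging.orders",
        write_disposition="WRITE_TRUNCATE",              # idempotent re-runs
        autodetect=True,
    )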

Role & Responsibilities:

● Work with business users and other stakeholders to understand business processes.

● Design and implement dimensional and fact tables

● Identify and implement data transformation/cleansing requirements

● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse

● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions

● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique

● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.

● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.

● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.

● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.

● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform data for reporting & analytics.

● Define and document the use of BI through user experiences/use cases and prototypes; test and deploy BI solutions.

● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.

● Train business end-users, IT analysts, and developers.

Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
7 - 14 yrs
₹20L - ₹30L / yr
Solution architecture
Data Analytics
Data architecture
Data Warehouse (DWH)
Enterprise Data Warehouse (EDW)

As a Solution Architect, you will collaborate with our sales, presales and COE teams to provide technical expertise and support throughout the new business acquisition process. You will play a crucial role in understanding customer requirements, presenting our solutions, and demonstrating the value of our products.


You thrive in high-pressure environments, maintaining a positive outlook and understanding that career growth is a journey that requires making strategic choices. You possess good communication skills, both written and verbal, enabling you to convey complex technical concepts clearly and effectively. You are a team player: customer-focused, self-motivated, and responsible, able to work under pressure with a positive attitude. You must have experience in managing and handling RFPs/RFIs, client demos and presentations, and converting opportunities into winning bids. You possess a strong work ethic, a positive attitude, and enthusiasm to embrace new challenges. You can multi-task and prioritize (good time management skills) and are willing to learn. You should be able to work independently with little or no supervision. You should be process-oriented, have a methodical approach, and demonstrate a quality-first mindset.


The ability to convert a client's business challenges and priorities into a winning proposal or bid through excellent technical solutions will be the key performance indicator for this role.


What you’ll do

  • Architecture & Design: Develop high-level architecture designs for scalable, secure, and robust solutions.
  • Technology Evaluation: Select appropriate technologies, frameworks, and platforms for business needs.
  • Cloud & Infrastructure: Design cloud-native, hybrid, or on-premises solutions using AWS, Azure, or GCP.
  • Integration: Ensure seamless integration between various enterprise applications, APIs, and third-party services.
  • Design and develop scalable, secure, and performant data architectures on Microsoft Azure and/or new-generation analytics platforms like MS Fabric.
  • Translate business needs into technical solutions by designing secure, scalable, and performant data architectures on cloud platforms.
  • Select and recommend appropriate data services (e.g., Fabric, Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, Power BI) to meet specific data storage, processing, and analytics needs.
  • Develop and recommend data models that optimize data access and querying. Design and implement data pipelines for efficient data extraction, transformation, and loading (ETL/ELT) processes.
  • Ability to understand Conceptual/Logical/Physical Data Modelling.
  • Choose and implement appropriate data storage, processing, and analytics services based on specific data needs (e.g., data lakes, data warehouses, data pipelines).
  • Understand and recommend data governance practices, including data lineage tracking, access control, and data quality monitoring.



What you will Bring 

  • 10+ years of working in data analytics and AI technologies from consulting, implementation and design perspectives
  • Certifications in data engineering, analytics, cloud, or AI will be a distinct advantage
  • Bachelor’s in engineering/ technology or an MCA from a reputed college is a must
  • Prior experience of working as a solution architect during presales cycle will be an advantage


Soft Skills

  • Communication Skills
  • Presentation Skills
  • Flexible and Hard-working


Technical Skills

  • Knowledge of Presales Processes
  • Basic understanding of business analytics and AI
  • High IQ and EQ


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.
Premier global software products and services firm

Agency job
via Recruiting Bond by Pavan Kumar
Hyderabad, Ahmedabad, Indore
5 - 10 yrs
₹10L - ₹20L / yr
Data engineering
Data modeling
Database Design
Data Warehouse (DWH)
Data warehousing

Job Summary: 

As a Data Engineering Lead, your role will involve designing, developing, and implementing interactive dashboards and reports using data engineering tools. You will work closely with stakeholders to gather requirements and translate them into effective data visualizations that provide valuable insights. Additionally, you will be responsible for extracting, transforming, and loading data from multiple sources into Power BI, ensuring its accuracy and integrity. Your expertise in Power BI and data analytics will contribute to informed decision-making and support the organization in driving data-centric strategies and initiatives.


We are looking for you!

As an ideal candidate for the Data Engineering Lead position, you embody the qualities of a team player with a relentless get-it-done attitude. Your intellectual curiosity and customer focus drive you to continuously seek new ways to add value to your job accomplishments.


You thrive under pressure, maintaining a positive attitude and understanding that your career is a journey. You are willing to make the right choices to support your growth. In addition to your excellent communication skills, both written and verbal, you have a proven ability to create visually compelling designs using tools like Power BI and Tableau that effectively communicate our core values. 


You build high-performing, scalable, enterprise-grade applications and teams. Your creativity and proactive nature enable you to think differently, find innovative solutions, deliver high-quality outputs, and ensure customers remain referenceable. With over eight years of experience in data engineering, you possess a strong sense of self-motivation and take ownership of your responsibilities. You prefer to work independently with little to no supervision. 


You are process-oriented, adopt a methodical approach, and demonstrate a quality-first mindset. You have led mid to large-size teams and accounts, consistently using constructive feedback mechanisms to improve productivity, accountability, and performance within the team. Your track record showcases your results-driven approach, as you have consistently delivered successful projects with customer case studies published on public platforms. Overall, you possess a unique combination of skills, qualities, and experiences that make you an ideal fit to lead our data engineering team(s).


You value inclusivity and want to join a culture that empowers you to show up as your authentic self. 


You know that success hinges on commitment, our differences make us stronger, and the finish line is always sweeter when the whole team crosses together. In your role, you should be driving the team using data, data, and more data. You will manage multiple teams; oversee agile stories and their statuses; handle escalations and mitigations; plan ahead; identify hiring needs and collaborate with recruitment teams on hiring; enable sales alongside pre-sales teams; and work closely with development managers/leads on solutioning and delivery status, and with architects on technology research and solutions.


What You Will Do: 

  • Analyze business requirements.
  • Analyze the data model, perform gap analysis against business requirements and Power BI, and design and model the Power BI schema.
  • Transform data in Power BI, SQL, or an ETL tool.
  • Create DAX formulas, reports, and dashboards.
  • Write SQL queries and stored procedures.
  • Design effective Power BI solutions based on business requirements.
  • Manage a team of Power BI developers and guide their work.
  • Integrate data from various sources into Power BI for analysis.
  • Optimize performance of reports and dashboards for smooth usage.
  • Collaborate with stakeholders to align Power BI projects with goals.
  • Knowledge of data warehousing is a must; data engineering is a plus.


What we need?

  • B.Tech in Computer Science or equivalent
  • Minimum 5+ years of relevant experience 


Why join us?

  • Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
  • Gain hands-on experience in content marketing with exposure to real-world projects.
  • Opportunity to learn from experienced professionals and enhance your marketing skills.
  • Contribute to exciting initiatives and make an impact from day one.
  • Competitive stipend and potential for growth within the company.
  • Recognized for excellence in data and AI solutions with industry awards and accolades.


MindInventory

Posted by Khushi Bhatt
Ahmedabad
3 - 5 yrs
₹4L - ₹12L / yr
Data engineering
ETL
Google Cloud Platform (GCP)
Apache Airflow
Snowflake schema
  • Minimum 3 years of experience as a Data Engineer required
  • Database knowledge: experience with time-series and graph databases is a must, along with SQL databases (PostgreSQL, MySQL) or NoSQL databases (Firestore, MongoDB)
  • Data pipelines: understanding of data pipeline processes such as ETL, ELT, and streaming pipelines, with tools like AWS Glue, Google Dataflow, Apache Airflow, or Apache NiFi
  • Data modeling: knowledge of snowflake schema and fact & dimension tables (see the sketch below)
  • Data warehousing tools: experience with Google BigQuery, Snowflake, Databricks
  • Performance optimization: indexing, partitioning, caching, and query optimization techniques
  • Python or SQL scripting: ability to write scripts for data processing and automation

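As a rough illustration of the snowflake-schema point above (a sketch under assumptions, not this team's schema), the official BigQuery Python client can create a date-partitioned, clustered fact table; the project, dataset, and field names are hypothetical.

from google.cloud import bigquery

client = bigquery.Client(project="example-project")      # hypothetical project

schema = [
    bigquery.SchemaField("order_id", "STRING", mode="REQUIRED"),
    bigquery.SchemaField("customer_key", "INT64"),       # FK into dim_customer
    bigquery.SchemaField("product_key", "INT64"),        # FK into dim_product
    bigquery.SchemaField("order_date", "DATE"),
    bigquery.SchemaField("amount", "NUMERIC"),
]

table = bigquery.Table("example-project.sales.fact_orders", schema=schema)
# Partitioning and clustering are BigQuery's analogue of the indexing and
# partitioning the posting asks about.
table.time_partitioning = bigquery.TimePartitioning(field="order_date")
table.clustering_fields = ["customer_key", "product_key"]

client.create_table(table, exists_ok=True)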

Auxo AI
kusuma Gullamajji
Posted by kusuma Gullamajji
Hyderabad, Bengaluru (Bangalore), Mumbai, Gurugram
7 - 12 yrs
₹20L - ₹40L / yr
Microsoft Windows Azure
ETL
Data modeling
Data governance

Job Description :


As a Data & Analytics Architect, you will lead key data initiatives, including cloud transformation, data governance, and AI projects. You'll define cloud architectures, guide data science teams in model development, and ensure alignment with data architecture principles across complex solutions. Additionally, you will create and govern architectural blueprints, ensuring standards are met and promoting best practices for data integration and consumption.


Responsibilities :


- Play a key role in driving a number of data and analytics initiatives including cloud data transformation, data governance, data quality, data standards, CRM, MDM, Generative AI and data science.


- Define cloud reference architectures to promote reusable patterns and promote best practices for data integration and consumption.


- Guide the data science team in implementing data models and analytics models.


- Serve as a data science architect delivering technology and architecture services to the data science community.


- In addition, you will also guide application development teams in the data design of complex solutions, in a large data eco-system, and ensure that teams are in alignment with the data architecture principles, standards, strategies, and target states.


- Create, maintain, and govern architectural views and blueprints depicting the Business and IT landscape in its current, transitional, and future state.


- Define and maintain standards for artifacts containing architectural content within the operating model.


Requirements :


- Strong cloud data architecture knowledge (preference for Microsoft Azure)


- 8-10+ years of experience in data architecture, with proven experience in cloud data transformation, MDM, data governance, and data science capabilities.


- Design reusable data architecture and best practices to support batch/streaming ingestion; efficient batch, real-time, and near-real-time integration/ETL; integrating quality rules; and structuring data for analytic consumption by end users.


- Ability to lead software evaluations including RFP development, capabilities assessment, formal scoring models, and delivery of executive presentations supporting a final recommendation.


- Well versed in the Data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Standards, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, non-traditional data and multi-media, ETL, ESB).


- Experience with cloud data technologies such as Azure Data Factory, Azure Data Fabric, Azure Storage, Azure Data Lake Storage, Azure Databricks, Azure AD, Azure ML, etc.


- Experience with big data technologies such as Cloudera, Spark, Sqoop, Hive, HDFS, Flume, Storm, and Kafka.

Moative

Posted by Eman Khan
Chennai
1 - 5 yrs
₹15L - ₹30L / yr
NumPy
pandas
Scikit-Learn
Natural Language Toolkit (NLTK)
Machine Learning (ML)

About Moative

Moative, an Applied AI Services company, designs AI roadmaps, builds co-pilots and predictive AI solutions for companies in energy, utilities, packaging, commerce, and other primary industries. Through Moative Labs, we aspire to build micro-products and launch AI startups in vertical markets.


Our Past: We have built and sold two companies, one of which was an AI company. Our founders and leaders are Math PhDs, Ivy League University Alumni, Ex-Googlers, and successful entrepreneurs. 


Role

We seek skilled and experienced data science/machine learning professionals with a strong background in at least one of mathematics, financial engineering, or electrical engineering to join our Energy & Utilities team. If you are interested in artificial intelligence, excited about solving real business problems in the energy and utilities industry, and keen to contribute to impactful projects, this role is for you!


Work you’ll do

As a data scientist in the energy and utilities industry, you will perform quantitative analysis and build mathematical models to forecast energy demand and supply and to devise strategies for efficient load balancing. You will work on models for short-term and long-term pricing, improving operational efficiency, reducing costs, and ensuring reliable power supply. You'll work closely with cross-functional teams to deploy these models in solutions that provide insights into real-world business problems. You will also be involved in conducting experiments and building POCs and prototypes.


Responsibilities

  • Develop and implement quantitative models for load forecasting and for energy production and distribution optimization (a toy sketch follows this list).
  • Analyze historical data to identify and predict extreme events, and measure impact of extreme events. Enhance existing pricing and risk management frameworks.
  • Develop and implement quantitative models for energy pricing and risk management. Monitor market conditions and adjust models as needed to ensure accuracy and effectiveness.
  • Collaborate with engineering and operations teams to provide quantitative support for energy projects. Enhance existing energy management systems and develop new strategies for energy conservation.
  • Maintain and improve quantitative tools and software used in energy management.
  • Support end-to-end ML/ AI model lifecycle - from data preparation, data analysis and feature engineering to model development, validation and deployment
  • Collaborate with domain experts, engineers, and stakeholders in translating business problems into data-driven solutions
  • Document methodologies and results, present findings and communicate insights to non-technical audiences

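The sketch mentioned in the first responsibility, a toy simplification of load forecasting (our illustration, not Moative's models): predict the next hour's demand from lagged demand and an hour-of-day feature with a plain linear model. The CSV file and column names are invented.

import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("hourly_load.csv", parse_dates=["timestamp"])   # hypothetical file
df = df.sort_values("timestamp")

# Lag features: demand 1 h and 24 h ago, plus hour-of-day seasonality.
df["lag_1"] = df["load_mw"].shift(1)
df["lag_24"] = df["load_mw"].shift(24)
df["hour"] = df["timestamp"].dt.hour
df = df.dropna()

X, y = df[["lag_1", "lag_24", "hour"]], df["load_mw"]
split = int(len(df) * 0.8)              # time-ordered train/test split
model = LinearRegression().fit(X[:split], y[:split])
print("R^2 on held-out hours:", model.score(X[split:], y[split:]))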

Skills & Requirements

  • Strong background in mathematics, econometrics, electrical engineering, or a related field.
  • Experience with data analysis and quantitative modeling using programming languages such as Python or R.
  • Excellent analytical and problem-solving skills.
  • Strong understanding of and experience with data analysis, statistical and mathematical concepts, and ML algorithms.
  • Proficiency in Python and familiarity with basic Python libraries for data analysis and ML (such as NumPy, pandas, scikit-learn, NLTK).
  • Strong communication skills.
  • Strong collaboration skills and the ability to work with engineering and operations teams.
  • A continuous-learning attitude and a problem-solving mindset.

Good to have -

  • Knowledge of energy markets, regulations, and utility operations.
  • Working knowledge of cloud platforms (e.g., AWS, Azure, GCP).
  • Broad understanding of data structures and data engineering.


Working at Moative

Moative is a young company, but we believe strongly in thinking long-term, while acting with urgency. Our ethos is rooted in innovation, efficiency and high-quality outcomes. We believe the future of work is AI-augmented and boundaryless. Here are some of our guiding principles:


  • Think in decades. Act in hours. As an independent company, our moat is time. While our decisions are for the long-term horizon, our execution will be fast – measured in hours and days, not weeks and months.
  • Own the canvas. Throw yourself in to build, fix or improve – anything that isn’t done right, irrespective of who did it. Be selfish about improving across the organization – because once the rot sets in, we waste years in surgery and recovery.
  • Use data or don’t use data. Use data where you ought to but not as a ‘cover-my-back’ political tool. Be capable of making decisions with partial or limited data. Get better at intuition and pattern-matching. Whichever way you go, be mostly right about it.
  • Avoid work about work. Process creeps on purpose, unless we constantly question it. We are deliberate about committing to rituals that take time away from the actual work. We truly believe that a meeting that could be an email, should be an email and you don’t need a person with the highest title to say that loud.
  • High revenue per person. We work backwards from this metric. Our default is to automate instead of hiring. We multi-skill our people to own more outcomes than hiring someone who has less to do. We don’t like squatting and hoarding that comes in the form of hiring for growth. High revenue per person comes from high quality work from everyone. We demand it.


If this role and our work is of interest to you, please apply here. We encourage you to apply even if you believe you do not meet all the requirements listed above.


That said, you should demonstrate that you are in the 90th percentile or above. This may mean that you have studied in top-notch institutions, won competitions that are intellectually demanding, built something of your own, or rated as an outstanding performer by your current or previous employers.


The position is based out of Chennai. Our work currently involves significant in-person collaboration and we expect you to work out of our offices in Chennai.

Remote only
7 - 12 yrs
₹20L - ₹35L / yr
Snowflake
Looker / LookML
Data engineering
Amazon Web Services (AWS)
Data modeling

Role & Responsibilities

Data Organization and Governance: Define and maintain governance standards that span multiple systems (AWS, Fivetran, Snowflake, PostgreSQL, Salesforce/nCino, Looker), ensuring that data remains accurate, accessible, and organized across the organization.

Solve Data Problems Proactively: Address recurring data issues that sidetrack operational and strategic initiatives by implementing processes and tools to anticipate, identify, and resolve root causes effectively.

System Integration: Lead the integration of diverse systems into a cohesive data environment, optimizing workflows and minimizing manual intervention.

Hands-On Problem Solving: Take a hands-on approach to resolving reporting issues and troubleshooting data challenges when necessary, ensuring minimal disruption to business operations.

Collaboration Across Teams: Work closely with business and technical stakeholders to understand and solve our biggest challenges.

Mentorship and Leadership: Guide and mentor team members, fostering a culture of accountability and excellence in data management practices.

Strategic Data Support: Ensure that marketing, analytics, and other strategic initiatives are not derailed by data integrity issues, enabling the organization to focus on growth and innovation.

Chennai
4 - 6 yrs
₹1L - ₹16L / yr
Python
SQL
Spark
ETL
Apache Airflow


We are looking for a skilled Data Engineer to design, build, and maintain robust data pipelines and infrastructure. You will play a pivotal role in optimizing data flow, ensuring scalability, and enabling seamless access to structured and unstructured data across the organization. This role requires technical expertise in Python, SQL, ETL/ELT frameworks, and cloud data warehouses, along with strong collaboration skills to partner with cross-functional teams.


Company: BigThinkCode Technologies

URL:

Location: Chennai (Work from office / Hybrid)

Experience: 4 - 6 years


Key Responsibilities:

  • Design, develop, and maintain scalable ETL/ELT pipelines to process structured and unstructured data.
  • Optimize and manage SQL queries for performance and efficiency in large-scale datasets.
  • Experience working with data warehouse solutions (e.g., Redshift, BigQuery, Snowflake) for analytics and reporting.
  • Collaborate with data scientists, analysts, and business stakeholders to translate requirements into technical solutions.
  • Experience implementing solutions for streaming data (e.g., Apache Kafka, AWS Kinesis) is preferred but not mandatory.
  • Ensure data quality, governance, and security across pipelines and storage systems.
  • Document architectures, processes, and workflows for clarity and reproducibility.


Required Technical Skills:

  • Proficiency in Python for scripting, automation, and pipeline development.
  • Expertise in SQL (complex queries, optimization, and database design).
  • Hands-on experience with ETL/ELT tools (e.g., Apache Airflow, dbt, AWS Glue; a sketch follows this list).
  • Experience working with structured data (RDBMS) and unstructured data (JSON, Parquet, Avro).
  • Familiarity with cloud-based data warehouses (Redshift, BigQuery, Snowflake).
  • Knowledge of version control systems (e.g., Git) and CI/CD practices.

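The sketch referenced in the ETL/ELT item above, a minimal incremental load in Python. Connection strings, table names, and the staging pattern are hypothetical assumptions, not BigThinkCode's actual stack; the payload column is assumed to hold JSON strings.

import json

import pandas as pd
from sqlalchemy import create_engine, text

source = create_engine("postgresql://user:pass@source-host/app")    # hypothetical DSN
warehouse = create_engine("postgresql://user:pass@dwh-host/dwh")    # hypothetical DSN

# Extract: only rows newer than the last successful load (incremental, not full scans).
last_ts = pd.read_sql(
    text("SELECT COALESCE(MAX(updated_at), '1970-01-01') AS ts FROM stg_events"),
    warehouse,
).iloc[0, 0]
events = pd.read_sql(
    text("SELECT id, payload, updated_at FROM events WHERE updated_at > :ts"),
    source,
    params={"ts": last_ts},
)

# Transform: flatten the JSON payload into real columns.
flat = pd.json_normalize(events["payload"].map(json.loads))
out = pd.concat([events[["id", "updated_at"]].reset_index(drop=True), flat], axis=1)

# Load: append to staging; dedupe/merge happens inside the warehouse (ELT).
out.to_sql("stg_events", warehouse, if_exists="append", index=False)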

Preferred Qualifications:

  • Experience with streaming data technologies (e.g., Kafka, Kinesis, Spark Streaming).
  • Exposure to cloud platforms (AWS, GCP, Azure) and their data services.
  • Understanding of data modelling (dimensional, star schema) and optimization techniques.


Soft Skills:

  • Team player with a collaborative mindset and ability to mentor junior engineers.
  • Strong stakeholder management skills to align technical solutions with business goals.
  • Excellent communication skills to explain technical concepts to non-technical audiences.
  • Proactive problem-solving and adaptability in fast-paced environments.


If interested, apply or reply with your updated profile so we can connect and discuss.


Regards

DataToBiz Pvt. Ltd.

Posted by Vibhanshi Bakliwal
Pune
8 - 12 yrs
₹15L - ₹18L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization

We are seeking a highly skilled and experienced Power BI Lead / Architect to join our growing team. The ideal candidate will have a strong understanding of data warehousing, data modeling, and business intelligence best practices. This role will be responsible for leading the design, development, and implementation of complex Power BI solutions that provide actionable insights to key stakeholders across the organization.


Location - Pune (Hybrid 3 days)


Responsibilities:


Lead the design, development, and implementation of complex Power BI dashboards, reports, and visualizations.

Develop and maintain data models (star schema, snowflake schema) for optimal data analysis and reporting.

Perform data analysis, data cleansing, and data transformation using SQL and other ETL tools.

Collaborate with business stakeholders to understand their data needs and translate them into effective and insightful reports.

Develop and maintain data pipelines and ETL processes to ensure data accuracy and consistency.

Troubleshoot and resolve technical issues related to Power BI dashboards and reports.

Provide technical guidance and mentorship to junior team members.

Stay abreast of the latest trends and technologies in the Power BI ecosystem.

Ensure data security, governance, and compliance with industry best practices.

Contribute to the development and improvement of the organization's data and analytics strategy.

May lead and mentor a team of junior Power BI developers.


Qualifications:


8-12 years of experience in Business Intelligence and Data Analytics.

Proven expertise in Power BI development, including DAX and advanced data modeling techniques.

Strong SQL skills, including writing complex queries, stored procedures, and views.

Experience with ETL/ELT processes and tools.

Experience with data warehousing concepts and methodologies.

Excellent analytical, problem-solving, and communication skills.

Strong teamwork and collaboration skills.

Ability to work independently and proactively.

Bachelor's degree in Computer Science, Information Systems, or a related field preferred.

VyTCDC
Gobinath Sundaram
Posted by Gobinath Sundaram
Bengaluru (Bangalore)
3 - 8 yrs
₹5L - ₹20L / yr
ETL QA
Data Warehouse (DWH)
SQL
Data modeling
Data migration
  • Extract Transform Load (ETL) and ETL Tools skills
  • Data Modeling and Data Integration expertise
  • Data Warehousing knowledge
  • Experience in working with SQL databases
  • Strong analytical and problem-solving abilities
  • Excellent communication and interpersonal skills
  • Bachelor's degree in Computer Science, Information Systems, or related field
  • Relevant certifications in ETL Testing or Data Warehousing


OnActive
Mansi Gupta
Posted by Mansi Gupta
Remote only
8 - 12 yrs
₹20L - ₹25L / yr
PowerBI
DAX
Data modeling
SQL
Python

Senior Data Analyst


Experience: 8+ Years

Work Mode: Remote Full Time


Responsibilities:

• Analyze large datasets to uncover trends, patterns, and insights to support business goals.

• Design, develop, and manage interactive dashboards and reports using Power BI.

• Utilize DAX and SQL for advanced data querying and data modeling.

• Create and manage complex SQL queries for data extraction, transformation, and loading processes.

• Collaborate with cross-functional teams to understand data requirements and translate them into actionable solutions.

• Maintain data accuracy and integrity across projects, ensuring reliable data-driven insights.

• Present findings to stakeholders, translating complex data insights into simple, actionable business recommendations.


Skills:

Power BI, DAX (Data Analysis Expressions), SQL, Data Modeling, Python

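As a small illustration of how the Python and SQL skills above typically meet (a hypothetical sketch; the dataset and column names are invented), the pandas snippet below rolls a metric up by month and surfaces the strongest month-over-month movements before they are modeled in Power BI/DAX.

import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["order_date"])    # hypothetical extract

monthly = (
    sales.set_index("order_date")
    .groupby("region")["revenue"]
    .resample("MS").sum()                # calendar-month buckets
    .reset_index()
)

# Month-over-month growth per region: large absolute values are the
# trends and patterns worth surfacing to stakeholders.
monthly["mom_growth"] = monthly.groupby("region")["revenue"].pct_change()
print(monthly.sort_values("mom_growth", ascending=False).head())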

Preferred Skills:

• Machine Learning: Exposure to machine learning models and their integration within analytical solutions.

• Microsoft Fabric: Familiarity with Microsoft Fabric for enhanced data integration and management.

Wissen Technology

Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore)
3 - 5 yrs
Best in industry
SQL
Amazon Web Services (AWS)
Python
Data Warehouse (DWH)
Informatica

Job Description for Data Engineer Role:

Must have:

Experience working with programming languages; solid foundational and conceptual knowledge is expected.

Experience working with databases and SQL optimizations.

Experience as a team lead or tech lead, able to independently drive tech decisions and execution and to motivate the team in ambiguous problem spaces.

Problem-solving, judgement and strategic decision-making skills to drive the team forward.

 

Role and Responsibilities:

  • Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, mentoring other members of the engineering community, and from time to time, be asked to code or evaluate code
  • Collaborate with digital product managers and leaders from other teams to refine the strategic needs of the project
  • Utilize programming languages like Java, Python, SQL, Node, Go, and Scala, Open Source RDBMS and NoSQL databases
  • Defining best practices for data validation and automating as much as possible; aligning with the enterprise standards

Qualifications -

  • Experience with SQL and NoSQL databases.
  • Experience with cloud platforms, preferably AWS.
  • Strong experience with data warehousing and data lake technologies (Snowflake; see the sketch below).
  • Expertise in data modelling.
  • Experience with ETL/ELT tools and methodologies.
  • Experience working on real-time data streaming and data streaming platforms.
  • 2+ years of experience in at least one of the following: Java, Scala, Python, Go, or Node.js.
  • 2+ years working with SQL and NoSQL databases, data modeling and data management.
  • 2+ years of experience with AWS, GCP, Azure, or another cloud service.

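A brief sketch of the Snowflake piece of this stack, using the official snowflake-connector-python package; the account, credentials, and object names are placeholders, not Wissen's environment.

import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",            # hypothetical account locator
    user="analyst",
    password="...",
    warehouse="ANALYTICS_WH",
    database="DWH",
    schema="CORE",
)
try:
    cur = conn.cursor()
    # A typical warehouse query against a fact table.
    cur.execute("""
        SELECT order_date, SUM(amount) AS revenue
        FROM fact_orders
        GROUP BY order_date
        ORDER BY order_date DESC
        LIMIT 7
    """)
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()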

Hyderabad
3 - 6 yrs
₹10L - ₹16L / yr
SQL
Spark
Analytical Skills
Hadoop
Communication Skills

The Sr. Analytics Engineer would provide technical expertise in needs identification, data modeling, data movement, and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective.


Understands and leverages best-fit technologies (e.g., traditional star schema structures, cloud, Hadoop, NoSQL, etc.) and approaches to address business and environmental challenges.


Provides data understanding and coordinates data-related activities with other data management groups such as master data management, data governance, and metadata management.


Actively participates with other consultants in problem-solving and approach development.


Responsibilities :


Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs.


Perform data analysis to validate data models and to confirm the ability to meet business needs.


Assist with and support setting the data architecture direction, ensuring data architecture deliverables are developed, ensuring compliance to standards and guidelines, implementing the data architecture, and supporting technical developers at a project or business unit level.


Coordinate and consult with the Data Architect, project manager, client business staff, client technical staff and project developers in data architecture best practices and anything else that is data related at the project or business unit levels.


Work closely with Business Analysts and Solution Architects to design the data model satisfying the business needs and adhering to Enterprise Architecture.


Coordinate with Data Architects, Program Managers and participate in recurring meetings.


Help and mentor team members to understand the data model and subject areas.


Ensure that the team adheres to best practices and guidelines.


Requirements :


- At least 3 years of strong working knowledge of Spark, Java/Scala/PySpark, Kafka, Git, Unix/Linux, and ETL pipeline design.


- Experience with Spark optimization/tuning/resource allocation (see the sketch below).


- Excellent understanding of in-memory distributed computing frameworks like Spark, including parameter tuning and writing optimized workflow sequences.


- Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., Redshift, BigQuery, Cassandra).


- Familiarity with Docker, Kubernetes, Azure Data Lake/Blob storage, AWS S3, Google Cloud storage, etc.


- Have a deep understanding of the various stacks and components of the Big Data ecosystem.


- Hands-on experience with Python is a huge plus

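The sketch referenced above: two everyday Spark tuning moves, right-sizing shuffle partitions and broadcasting a small dimension table to avoid a shuffle join. Paths and names are hypothetical; this is an illustration, not this team's code.

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder.appName("tuning-sketch")
    # Default is 200; right-size to data volume and core count.
    .config("spark.sql.shuffle.partitions", "64")
    .getOrCreate()
)

events = spark.read.parquet("s3a://example-bucket/events/")      # hypothetical path
dim_user = spark.read.parquet("s3a://example-bucket/dim_user/")  # small lookup table

# Broadcast the small side so each executor joins locally (no shuffle).
joined = events.join(broadcast(dim_user), "user_id")

# Cache only when the result is reused by several downstream actions.
joined.cache()
joined.groupBy("country").count().show()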
Wissen Technology

Posted by Vijayalakshmi Selvaraj
Bengaluru (Bangalore), Pune, Mumbai
10 - 22 yrs
Best in industry
DynamoDB
AWS
EMR
Data migration
Data modeling

Responsibilities:

 

  • Design, implement, and maintain scalable and reliable database solutions on the AWS platform.
  • Architect, deploy, and optimize DynamoDB databases for performance, scalability, and cost-efficiency (a sketch follows this list).
  • Configure and manage AWS OpenSearch (formerly Amazon Elasticsearch Service) clusters for real-time search and analytics capabilities.
  • Design and implement data processing and analytics solutions using AWS EMR (Elastic MapReduce) for large-scale data processing tasks.
  • Collaborate with cross-functional teams to gather requirements, design database solutions, and implement best practices.
  • Perform performance tuning, monitoring, and troubleshooting of database systems to ensure high availability and performance.
  • Develop and maintain documentation, including architecture diagrams, configurations, and operational procedures.
  • Stay current with the latest AWS services, database technologies, and industry trends to provide recommendations for continuous improvement.
  • Participate in the evaluation and selection of new technologies, tools, and frameworks to enhance database capabilities.
  • Provide guidance and mentorship to junior team members, fostering knowledge sharing and skill development.

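The sketch referenced in the DynamoDB bullet above, a minimal boto3 illustration of a common single-table design: a composite partition/sort key and a Query that reads one customer's items, newest first. The table, keys, and region are hypothetical.

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.create_table(
    TableName="orders",                                  # hypothetical table
    KeySchema=[
        {"AttributeName": "pk", "KeyType": "HASH"},      # e.g. CUSTOMER#123
        {"AttributeName": "sk", "KeyType": "RANGE"},     # e.g. ORDER#2024-06-01
    ],
    AttributeDefinitions=[
        {"AttributeName": "pk", "AttributeType": "S"},
        {"AttributeName": "sk", "AttributeType": "S"},
    ],
    BillingMode="PAY_PER_REQUEST",                       # on-demand capacity
)
table.wait_until_exists()

table.put_item(Item={"pk": "CUSTOMER#123", "sk": "ORDER#2024-06-01", "total": 42})

resp = table.query(
    KeyConditionExpression=Key("pk").eq("CUSTOMER#123"),
    ScanIndexForward=False,                              # newest items first
)
print(resp["Items"])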
 

Requirements:

 

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Proven experience as an AWS Architect or similar role, with a focus on database technologies.
  • Hands-on experience designing, implementing, and optimizing DynamoDB databases in production environments.
  • In-depth knowledge of AWS OpenSearch (Elasticsearch) and experience configuring and managing clusters for search and analytics use cases.
  • Proficiency in working with AWS EMR (Elastic MapReduce) for big data processing and analytics.
  • Strong understanding of database concepts, data modelling, indexing, and query optimization.
  • Experience with AWS services such as S3, EC2, RDS, Redshift, Lambda, and CloudFormation.
  • Excellent problem-solving skills and the ability to troubleshoot complex database issues.
  • Solid understanding of cloud security best practices and experience implementing security controls in AWS environments.
  • Strong communication and collaboration skills with the ability to work effectively in a team environment.
  • AWS certifications such as AWS Certified Solutions Architect, AWS Certified Database - Specialty, or equivalent certifications are a plus.


Fictiv

Posted by Margaret Moses
Pune
5 - 12 yrs
Best in industry
Salesforce Apex
Salesforce Visualforce
Salesforce Lightning
Salesforce development
Data modeling

What’s in it for you?

Opportunity To Unlock Your Creativity

Think of all the times you were held back from trying new ideas because you were boxed in by bureaucratic legacy processes or old school tactics. Having a growth mindset is deeply ingrained into our company culture since day 1 so Fictiv is an environment where you have the creative liberty and support of the team to try big bold ideas to achieve our sales and customer goals.

Opportunity To Grow Your Career

At Fictiv, you'll be surrounded by supportive teammates who will push you to be your best through their curiosity and passion.


Impact In This Role

Excellent problem solving, decision-making and critical thinking skills.

Collaborative, a team player.

Excellent verbal and written communication skills.

Exhibits initiative, integrity and empathy.

Enjoy working with a diverse group of people in multiple regions.

Comfortable not knowing answers, but resourceful and able to resolve issues.

Self-starter; comfortable with ambiguity, asking questions and constantly learning.

Customer service mentality; advocates for another person's point of view.

Methodical and thorough in written documentation and communication.

Culture oriented; wants to work with people rather than in isolation.

You will report to the Director of IT Engineering

What You’ll Be Doing

  • Interface with Business Analysts and Stakeholders to understand & clarify requirements
  • Develop technical designs for solution development
  • Implement high quality, scalable solutions following best practices, including configuration and code.
  • Deploy solutions and code using automated deployment tools
  • Take ownership of technical deliverables, ensure that quality work is completed, fully tested, delivered on time.
  • Conduct code reviews, optimization, and refactoring to minimize technical debt within Salesforce implementations.
  • Collaborate with cross-functional teams to integrate Salesforce with other systems and platforms, ensuring seamless data flow and system interoperability.
  • Identify opportunities for process improvements, mentor and support other developers/team members as needed.
  • Stay updated on new Salesforce features and functionalities and provide recommendations for process improvements.


Desired Traits

  • 8-10 years of experience in Salesforce development
  • Proven experience in developing Salesforce solutions with a deep understanding of Apex, Visualforce, Lightning Web Components, and Salesforce APIs.
  • Experience with Salesforce CPQ, Sales/Manufacturing Cloud, and Case Management
  • Experienced in designing and implementing custom solutions that align with business needs.
  • Strong knowledge of Salesforce data modeling, reporting, and database design.
  • Demonstrated experience in building and maintaining integrations between Salesforce and external applications.
  • Strong unit testing, functional testing and debugging skills
  • Strong understanding of best practices
  • Active Salesforce Certifications are desirable.
  • Experience in Mulesoft is a plus
  • Excellent communication skills and the ability to translate complex technical requirements into actionable solutions.

Interested in learning more? We look forward to hearing from you soon.

PloPdo
Chandan Nadkarni
Posted by Chandan Nadkarni
Hyderabad
3 - 12 yrs
₹22L - ₹25L / yr
Cassandra
Data modeling

Responsibilities -

  • Collaborate with the development team to understand data requirements and identify potential scalability issues.
  • Design, develop, and implement scalable data pipelines and ETL processes to ingest, process, and analyse large volumes of data from various sources.
  • Optimize data models and database schemas to improve query performance and reduce latency (a sketch follows this list).
  • Monitor and troubleshoot the performance of our Cassandra database on Azure Cosmos DB, identifying bottlenecks and implementing optimizations as needed.
  • Work with cross-functional teams to ensure data quality, integrity, and security.
  • Stay up to date with emerging technologies and best practices in data engineering and distributed systems.

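The sketch referenced above, a small illustration of query-first Cassandra modeling with the Python driver. The keyspace, table, and contact point are hypothetical, and on Azure Cosmos DB's Cassandra API the connection setup differs.

import datetime

from cassandra.cluster import Cluster

session = Cluster(["127.0.0.1"]).connect()       # hypothetical contact point

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS telemetry
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
# Partition by (sensor_id, day) so each day's readings stay in one bounded
# partition; cluster newest-first for the hot "latest readings" query.
session.execute("""
    CREATE TABLE IF NOT EXISTS telemetry.readings (
        sensor_id text,
        day date,
        ts timestamp,
        value double,
        PRIMARY KEY ((sensor_id, day), ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

rows = session.execute(
    "SELECT ts, value FROM telemetry.readings "
    "WHERE sensor_id = %s AND day = %s LIMIT 10",
    ("s-42", datetime.date(2024, 6, 1)),
)
for row in rows:
    print(row.ts, row.value)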

Qualifications & Requirements -

  • Proven experience as a Data Engineer or similar role, with a focus on designing and optimizing large-scale data systems.
  • Strong proficiency in working with NoSQL databases, particularly Cassandra.
  • Experience with cloud-based data platforms, preferably Azure Cosmos DB.
  • Solid understanding of distributed systems, data modelling, data warehouse design, and ETL processes.
  • Detailed understanding of Software Development Life Cycle (SDLC) is required.
  • Good to have knowledge on any visualization tool like Power BI, Tableau.
  • Good to have knowledge on SAP landscape (SAP ECC, SLT, BW, HANA etc).
  • Good to have experience on Data Migration Project.
  • Knowledge of Supply Chain domain would be a plus.
  • Familiarity with software architecture (data structures, data schemas, etc.)
  • Familiarity with Python programming language is a plus.
  • The ability to work in a dynamic, fast-paced, work environment.
  • A passion for data and information with strong analytical, problem solving, and organizational skills.
  • Self-motivated with the ability to work under minimal direction.
  • Strong communication and collaboration skills, with the ability to work effectively in a cross-functional team environment.


Arting Digital
Pragati Bhardwaj
Posted by Pragati Bhardwaj
Bengaluru (Bangalore)
10 - 16 yrs
₹10L - ₹15L / yr
Databricks
Data modeling
SQL
Python
AWS Lambda

Title: Lead Data Engineer


Experience: 10+ years

Budget: 32-36 LPA

Location: Bangalore

Work Mode: Work from office

Primary Skills: Databricks, Spark, PySpark, SQL, Python, AWS

Qualification: Any Engineering degree


Roles and Responsibilities:


• 8-10+ years’ experience in developing scalable Big Data applications or solutions on distributed platforms.

• Able to partner with others in solving complex problems by taking a broad perspective to identify innovative solutions.

• Strong skills in building positive relationships across Product and Engineering.

• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.

• Able to quickly pick up new programming languages, technologies, and frameworks.

• Experience working in Agile and Scrum development processes.

• Experience working in a fast-paced, results-oriented environment.

• Experience in Amazon Web Services (AWS), mainly S3, Managed Airflow, EMR/EC2, IAM, etc.

• Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.

• Experience architecting data products in streaming, serverless, and microservices architectures and platforms.

• Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components, and the Lakehouse Medallion architecture), etc.

• Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for Managed Spark jobs, building Docker images, etc.

• Experience working with distributed technology tools, including Spark, Python, Scala.

• Working knowledge of data warehousing, data modelling, governance, and data architecture.

• Working knowledge of reporting and analytical tools such as Tableau, QuickSight, etc.

• Demonstrated experience in learning new technologies and skills.

• Bachelor’s degree in Computer Science, Information Systems, Business, or another relevant subject area.

Insurance Org

Agency job
via InvokHR by Jessica Chen
Gurugram
3 - 8 yrs
₹10L - ₹15L / yr
Guidewire
GOSU
API
User Interface (UI) Development
Data modeling

Role Title: Developer - Guidewire Integration-Config

 

 

Role Purpose

We are looking for a Developer for our Claims Guidewire team, who is a technology enthusiast, and eager to be part of a culture of modern software engineering practices, continuous improvement, and innovation.

 

As a Developer, you will be part of a dynamic engineering team and work on development, maintenance, and transformation of our strategic Claims Guidewire platform. You will learn about software applications, technology stack, ways of working and standards.

 

 

Key Accountabilities

 

• Deliver software development tasks for Claims Guidewire applications, in the areas of integration and configuration, with expected quality measures and timeframes: e.g., coding, writing unit test cases (GUnit) and unit testing, debugging and defect fixing, providing test support, and providing release support.

• Communicate with technical leads and IT groups to understand the project’s technical implications, dependencies, and potential conflicts.

• Research issues reported in Production, perform root cause analysis and document it, and respond to and resolve technical issues in a timely manner.

• Perform versioning of release updates and resolve code conflicts while merging and promoting code to higher environments.

• Develop technical and functional knowledge of the Claims Digital Guidewire platform.

• Understand and follow Guidewire’s cloud standards for application development.

• Actively participate in team meetings like daily stand-ups, risk forums, planning sessions and retrospectives.



Skills & Experience

• 3+ years of development experience on the Guidewire cloud platform and applications; Guidewire certification preferred.

• Hands-on development expertise on Guidewire ClaimCenter configuration and integration.

• Experience in the Guidewire platform (Gosu scripting / Edge APIs / UI / Data Model).

• Knowledge of admin data loading, assignment and segmentation rules, pre-update and validation rules, authority limits, and financials (checks, reserves, recoveries …).

• Good experience with LOB configuration and related typelists.

• Good experience with integration components, including plug-ins, messaging (and supporting business rules), batches, REST APIs, and programs that call the Guidewire application APIs.

• Experience with Oracle or SQL Server databases and well versed in SQL.

• Experience working in a CI/CD setup and with related tools/technologies.

• Insurance domain knowledge with a Property & Casualty background preferred.

 


Location: Gurugram

CTC: up to 25 LPA


PGP Glass Pvt Ltd
Animesh Srivastava
Posted by Animesh Srivastava
Vadodara
1 - 4 yrs
₹6L - ₹13L / yr
Data modeling
Machine Learning (ML)

Key Roles/Responsibilities:

• Develop an understanding of business obstacles, create solutions based on advanced analytics, and draw implications for model development.

• Combine, explore and draw insights from data, often large and complex data assets from different parts of the business.

• Design and build explorative, predictive or prescriptive models, utilizing optimization, simulation and machine learning techniques.

• Prototype and pilot new solutions, and be a part of the aim of ‘productifying’ those valuable solutions that can have impact at a global scale.

• Guide and coach other chapter colleagues to help solve data/technical problems at an operational level, and in methodologies to help improve development processes.

• Identify and interpret trends and patterns in complex data sets to enable the business to take data-driven decisions.
This opening is with an MNC

Agency job
via LK Consultants by Namita Agate
Mumbai, Malad, Andheri
8 - 13 yrs
₹13L - ₹22L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization

• Minimum of 8 years of experience, of which 4 years should be of applied data mining experience in disciplines such as call centre metrics.

• Strong experience in advanced statistics and analytics, including segmentation, modelling, regression, forecasting etc.

• Experience with leading and managing large teams.

• Demonstrated pattern of success in using advanced quantitative analytic methods to solve business problems.

• Demonstrated experience with Business Intelligence/Data Mining tools to work with data, investigate anomalies, construct data sets, and build models.

• Critical to share details on projects undertaken (preferably in the telecom industry), specifically through analysis from CRM.
Hopscotch
Bengaluru (Bangalore)
5 - 8 yrs
₹6L - ₹15L / yr
Python
Amazon Redshift
Amazon Web Services (AWS)
PySpark
Data engineering

About the role:

Hopscotch is looking for a passionate Data Engineer to join our team. You will work closely with other teams like data analytics, marketing, data science and individual product teams to specify, validate, prototype, scale, and deploy data pipelines, features and data architecture.


Here’s what will be expected out of you:

➢ Ability to work in a fast-paced startup mindset. Should be able to manage all aspects of data extraction, transfer and load activities.

➢ Develop data pipelines that make data available across platforms.

➢ Should be comfortable in executing ETL (Extract, Transform and Load) processes which include data ingestion, data cleaning and curation into a data warehouse, database, or data platform.

➢ Work on various aspects of the AI/ML ecosystem – data modeling, data and ML pipelines.

➢ Work closely with Devops and senior Architect to come up with scalable system and model architectures for enabling real-time and batch services.


What we want:

➢ 5+ years of experience as a data engineer or data scientist with a focus on data engineering and ETL jobs.

➢ Well versed with the concept of Data warehousing, Data Modelling and/or Data Analysis.

➢ Experience using and building pipelines and performing ETL with industry-standard best practices on Redshift (more than 2 years; a sketch follows this list).

➢ Ability to troubleshoot and solve performance issues with data ingestion, data processing & query execution on Redshift.

➢ Good understanding of orchestration tools like Airflow.

 ➢ Strong Python and SQL coding skills.

➢ Strong Experience in distributed systems like spark.

➢ Experience with AWS data and ML technologies (AWS Glue, MWAA, Data Pipeline, EMR, Athena, Redshift, Lambda, etc.).

➢ Solid hands-on experience with various data extraction techniques, like CDC or time/batch based, and the related tools (Debezium, AWS DMS, Kafka Connect, etc.) for near-real-time and batch data extraction.

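The sketch referenced above: the standard stage-in-S3-then-COPY loading pattern on Redshift, which is far faster than row-by-row inserts. The cluster endpoint, IAM role, bucket, and table names are hypothetical.

import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # hypothetical
    port=5439, dbname="dwh", user="loader", password="...",
)
with conn, conn.cursor() as cur:
    # Bulk-load staged Parquet files in one parallel COPY.
    cur.execute("""
        COPY staging.orders
        FROM 's3://example-bucket/orders/2024-06-01/'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-loader'
        FORMAT AS PARQUET;
    """)
    # Upsert pattern: replace matching rows in the target, then clear staging.
    cur.execute("DELETE FROM core.orders USING staging.orders s WHERE core.orders.id = s.id;")
    cur.execute("INSERT INTO core.orders SELECT * FROM staging.orders;")
    cur.execute("TRUNCATE staging.orders;")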

Note:

Experience in product-based or e-commerce companies is an added advantage.

AI Domain US Based Product Based Company

Agency job
via New Era India by Asha P
Bengaluru (Bangalore)
3 - 10 yrs
₹30L - ₹50L / yr
Data engineering
Data modeling
Python

Requirements:

  • 2+ years of experience (4+ for Senior Data Engineer) with system/data integration, development, or implementation of enterprise and/or cloud software. Engineering degree in Computer Science, Engineering or a related field.
  • Extensive hands-on experience with data integration/EAI technologies (File, API, Queues, Streams), ETL Tools and building custom data pipelines.
  • Demonstrated proficiency with Python, JavaScript and/or Java
  • Familiarity with version control/SCM is a must (experience with git is a plus).
  • Experience with relational and NoSQL databases (any vendor) Solid understanding of cloud computing concepts.
  • Strong organisational and troubleshooting skills with attention to detail.
  • Strong analytical ability, judgment and problem-solving techniques, and interpersonal and communication skills, with the ability to work effectively in a cross-functional team.


Lifespark Technologies

Posted by Amey Desai
Mumbai
1 - 3 yrs
₹4L - ₹9L / yr
TensorFlow
Machine Learning (ML)
Computer Vision
Deep Learning
Time series

Lifespark is looking for individuals with a passion for impacting real lives through technology. Lifespark is one of the most promising startups in the Assistive Tech space in India, and has been honoured with several National and International awards. Our mission is to create seamless, persistent and affordable healthcare solutions. If you are someone who is driven to make a real impact in this world, we are your people.

Lifespark is currently building solutions for Parkinson’s Disease, and we are looking for an ML lead to join our growing team. You will be working directly with the founders on high-impact problems in the Neurology domain. You will be solving some of the most fundamental and exciting challenges in the industry and will have the ability to see your insights turned into real products every day.

 

Essential experience and requirements:

1. Advanced knowledge in the domains of computer vision, deep learning

2. Solid understanding of statistical/computational concepts like hypothesis testing, statistical inference, design of experiments, and production-level ML system design

3. Experienced with proper project workflow

4. Good at collating multiple datasets (potentially from different sources)

5. Good understanding of setting up production level data pipelines

6. Ability to independently develop and deploy ML systems to various platforms (local and cloud)

7. Fundamentally strong with time-series data analysis, cleaning, featurization and visualisation (see the sketch after this list)

8. Fundamental understanding of model and system explainability

9. Proactive at constantly unlearning and relearning

10. Documentation ninja - can understand others' documentation as well as create good documentation
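
A minimal pandas sketch of the time-series cleaning and featurization step referenced in point 7; the sensor column, sampling rate and window size are illustrative assumptions, not Lifespark's actual pipeline:

import pandas as pd

# Hypothetical gait-sensor readings at 10 Hz.
df = pd.DataFrame(
    {"timestamp": pd.date_range("2024-01-01", periods=8, freq="100ms"),
     "accel_z": [0.1, 0.4, None, 0.9, 0.7, 0.3, 0.8, 0.5]}
).set_index("timestamp")

# Cleaning: fill the gap, then build rolling-window features.
df["accel_z"] = df["accel_z"].interpolate()
features = pd.DataFrame({
    "rolling_mean": df["accel_z"].rolling("300ms").mean(),
    "rolling_std": df["accel_z"].rolling("300ms").std(),
    "diff": df["accel_z"].diff(),
})
print(features.dropna())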

 

Responsibilities :

1. Develop and deploy ML based systems built upon healthcare data in the Neurological domain

2. Maintain deployed systems and upgrade them through online learning

3. Develop and deploy advanced online data pipelines

Molecular Connections

at Molecular Connections

4 recruiters
Molecular Connections
Posted by Molecular Connections
Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹20L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+4 more
  1. Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
  4. Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
  5. Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include Name Node, Data Node, Resource Manager, Node Manager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
  8. Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked on Spark using Scala on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
  11. Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading data into HDFS.
  12. Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
  13. Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
  17. Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
  18. In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
  19. Hands on expertise in real time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience in Microsoft cloud and in setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
  23. Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
  24. Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
  25. Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
  26. Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
  27. Experience in creating DataFrames using PySpark and performing operations on them using Python (see the sketch after this list).
  28. In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
  29. Establishing multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provide the access for pulling the information we need for analysis. 
  30. Generated various kinds of knowledge reports using Power BI based on Business specification. 
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
  32. Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
  33. Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
  34. Good experience with use-case development, with Software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
  39. Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
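
A minimal PySpark sketch of the DataFrame-and-Hive workflow referenced in point 27; the input path, schema and table name are illustrative assumptions:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-agg").enableHiveSupport().getOrCreate()

# Read a raw extract from HDFS into a DataFrame.
orders = spark.read.option("header", True).csv("hdfs:///data/orders.csv")

# Aggregate into a daily revenue table.
daily = (
    orders
    .withColumn("amount", F.col("amount").cast("double"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Persist as a Hive table to feed downstream dashboards (e.g., Tableau).
daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")
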
Mobile Programming LLC

at Mobile Programming LLC

1 video
34 recruiters
Sukhdeep Singh
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Snow flake schema
Snowflake
+5 more

Job Title: AWS-Azure Data Engineer with Snowflake

Location: Bangalore, India

Experience: 4+ years

Budget: 15 to 20 LPA

Notice Period: Immediate joiners or less than 15 days

Job Description:

We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.

Responsibilities:

  1. Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
  2. Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
  3. Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning (see the sketch after this list).
  4. Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
  5. Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  6. Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
  7. Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
  8. Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
  9. Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
  10. Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
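
A minimal sketch of responsibility 3 using the snowflake-connector-python package; the account details and the star-schema tables are illustrative assumptions:

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="secret",
    warehouse="ETL_WH", database="ANALYTICS", schema="PUBLIC",
)
cur = conn.cursor()

# A tiny corner of a star schema: one dimension, one fact table.
cur.execute("""
    CREATE TABLE IF NOT EXISTS dim_customer (
        customer_key INTEGER AUTOINCREMENT PRIMARY KEY,
        customer_id  VARCHAR,
        region       VARCHAR
    )
""")
cur.execute("""
    CREATE TABLE IF NOT EXISTS fact_sales (
        customer_key INTEGER,
        sale_date    DATE,
        amount       NUMBER(12, 2)
    )
""")

# Performance tuning starts with the plan: inspect it before changing anything.
cur.execute(
    "EXPLAIN SELECT d.region, SUM(f.amount) FROM fact_sales f "
    "JOIN dim_customer d ON f.customer_key = d.customer_key GROUP BY d.region"
)
for row in cur.fetchall():
    print(row)
conn.close()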

Requirements:

  1. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  2. Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
  3. Strong proficiency in data modelling, ETL development, and data integration.
  4. Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
  5. In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
  6. Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
  7. Familiarity with data governance principles and security best practices.
  8. Strong problem-solving skills and ability to work independently in a fast-paced environment.
  9. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
  10. Immediate joiner or notice period less than 15 days preferred.

If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

Foxit Software
Remote only
5 - 12 yrs
₹22L - ₹35L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Web application security
+4 more

Experience Required: 5-10 yrs.

Job location: Sec-62, Noida

Work from office (Hybrid)


Development Platform: Backend Development- Java/J2EE, Struts, Spring, MySQL, OWASP


Job Brief:

Requirements:

·      5+ years of experience in developing distributed, multi-tier enterprise applications, APIs.

·      Fully participated in several major product development cycles.

·      Solid background in design, OOP, object, and data modelling.

·      Deep working knowledge of Java, Struts, Spring, and relational databases.

·      Experience in design and implementation of service interface and public APIs.

·      Actively involved in writing code in the current project.

·      Development knowledge and experience working with AWS, Azure, etc. is an added plus.

·      Clear understanding of, and hands-on experience with, OWASP Top 10 vulnerability classes such as XSS, CSRF, SQL injection, session hijacking, and authorization bypass (see the sketch after this list).

·      Find and resolve security concerns in the product/application.

·      Good documentation and reporting skills; strong communication and collaboration skills with various levels of executives, from top management to technical team members across the organization.

·      Strong self-starter who can operate independently.
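
To make the SQL injection item above concrete, here is the classic unsafe-versus-parameterized contrast. It is shown in Python with the standard-library sqlite3 module purely for brevity; the same pattern applies to Java's PreparedStatement in the stack this role uses:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES ('alice')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# UNSAFE: string concatenation lets the payload rewrite the query:
#   conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'")

# SAFE: a bound parameter is treated as data, never as SQL.
rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
print(rows)  # [] -- the payload matches nothing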

Personal Care Product Manufacturing

Agency job
via Qrata by Rayal Rajan
Mumbai
3 - 8 yrs
₹12L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

DATA ENGINEER


Overview

They started with a singular belief - what is beautiful cannot and should not be defined in marketing meetings. It's defined by the regular people like us, our sisters, our next-door neighbours, and the friends we make on the playground and in lecture halls. That's why we stand for people-proving everything we do. From the inception of a product idea to testing the final formulations before launch, our consumers are a part of each and every process. They guide and inspire us by sharing their stories with us. They tell us not only about the product they need and the skincare issues they face but also the tales of their struggles, dreams and triumphs. Skincare goes deeper than skin. It's a form of self-care for many. Wherever someone is on this journey, we want to cheer them on through the products we make, the content we create and the conversations we have. What we wish to build is more than a brand. We want to build a community that grows and glows together - cheering each other on, sharing knowledge, and ensuring people always have access to skincare that really works.

 

Job Description:

We are seeking a skilled and motivated Data Engineer to join our team. As a Data Engineer, you will be responsible for designing, developing, and maintaining the data infrastructure and systems that enable efficient data collection, storage, processing, and analysis. You will collaborate with cross-functional teams, including data scientists, analysts, and software engineers, to implement data pipelines and ensure the availability, reliability, and scalability of our data platform.


Responsibilities:

Design and implement scalable and robust data pipelines to collect, process, and store data from various sources.

Develop and maintain data warehouse and ETL (Extract, Transform, Load) processes for data integration and transformation.

Optimize and tune the performance of data systems to ensure efficient data processing and analysis.

Collaborate with data scientists and analysts to understand data requirements and implement solutions for data modeling and analysis.

Identify and resolve data quality issues, ensuring data accuracy, consistency, and completeness.

Implement and maintain data governance and security measures to protect sensitive data.

Monitor and troubleshoot data infrastructure, perform root cause analysis, and implement necessary fixes.

Stay up-to-date with emerging technologies and industry trends in data engineering and recommend their adoption when appropriate.


Qualifications:

Bachelor’s or higher degree in Computer Science, Information Systems, or a related field.

Proven experience as a Data Engineer or similar role, working with large-scale data processing and storage systems.

Strong programming skills in languages such as Python, Java, or Scala.

Experience with big data technologies and frameworks like Hadoop, Spark, or Kafka.

Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, or Oracle).

Familiarity with cloud platforms like AWS, Azure, or GCP, and their data services (e.g., S3, Redshift, BigQuery).

Solid understanding of data modeling, data warehousing, and ETL principles.

Knowledge of data integration techniques and tools (e.g., Apache Nifi, Talend, or Informatica).

Strong problem-solving and analytical skills, with the ability to handle complex data challenges.

Excellent communication and collaboration skills to work effectively in a team environment.


Preferred Qualifications:

Advanced knowledge of distributed computing and parallel processing.

Experience with real-time data processing and streaming technologies (e.g., Apache Kafka, Apache Flink).

Familiarity with machine learning concepts and frameworks (e.g., TensorFlow, PyTorch).

Knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).

Experience with data visualization and reporting tools (e.g., Tableau, Power BI).

Certification in relevant technologies or data engineering disciplines.



Vithamas Technologies Pvt LTD
Mysore
4 - 6 yrs
₹10L - ₹20L / yr
Data modeling
ETL
Oracle
MS SQL Server
MongoDB
+4 more

Required Skills:


• Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models).

• 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, Datastage, etc.

• 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes.

• Candidate should have exposure to a NoSQL technology, preferably MongoDB.

• Experience in processing large data volumes, indicated by experience with big data platforms (Teradata, Netezza, Vertica or Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).

• Understanding of data warehousing concepts and decision support systems.

• Ability to deal with sensitive and confidential material and adhere to worldwide data security standards.

• Experience writing documentation for design and feature requirements.

• Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc.

• Excellent communication and collaboration skills.

Porter.in

at Porter.in

1 recruiter
Agency job
via UPhill HR by Ingit Pandey
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹28L / yr
Python
SQL
Data Visualization
Data modeling
Predictive modelling
+1 more

Responsibilities

This role requires a person to support business charters & accompanying products by aligning with the Analytics Manager's vision, understanding tactical requirements and helping in successful execution. The split would be approx. 70% management + 30% individual contribution. Responsibilities include:


Project Management

- Understand business needs and objectives.

- Refine use cases and plan iterations and deliverables - able to pivot as required.

- Estimate efforts and conduct regular task updates to ensure timeline adherence.

- Set and manage stakeholder expectations as required


Quality Execution

- Help BA and SBA resources with requirement gathering and final presentations.

- Resolve blockers regarding technical challenges and decision-making.

- Check final deliverables for correctness and review codes, along with Manager.


KPIs and metrics

- Orchestrate metrics building, maintenance, and performance monitoring.

- Owns and manages data models, data sources, and data definition repo.

- Makes low-level design choices during execution.


Team Nurturing

- Help Analytics Manager during regular one-on-ones + check-ins + recruitment.

- Provide technical guidance whenever required.

- Improve benchmarking and decision-making skills at execution-level.

- Train and get new resources up-to-speed.

- Knowledge building (methodologies) to better position the team for complex problems.


Communication

- Upstream to document and discuss execution challenges, process inefficiencies, and feedback loops.

- Downstream and parallel for context-building, mentoring, stakeholder management.


Analytics Stack

- Analytics : Python / R + SQL + Excel / PPT, Colab notebooks

- Database : PostgreSQL, Amazon Redshift, DynamoDB, Aerospike

- Warehouse : Amazon Redshift

- ETL : Lots of Python + custom-made (see the sketch below)

- Business Intelligence / Visualization : Metabase + Python/R libraries (location data)

- Deployment pipeline : Docker, Git, Jenkins, AWS Lambda
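
A minimal sketch of the Python + SQL + Redshift loop this stack implies: pull a KPI aggregate into pandas for analysis or a dashboard feed. The host, table and column names are illustrative assumptions:

import pandas as pd
import psycopg2

conn = psycopg2.connect(
    host="analytics.redshift.amazonaws.com", port=5439,
    dbname="warehouse", user="analyst", password="secret",
)
kpi = pd.read_sql(
    """
    SELECT date_trunc('week', created_at) AS week,
           count(*)                       AS orders,
           avg(delivery_minutes)          AS avg_delivery_mins
    FROM trips
    GROUP BY 1
    ORDER BY 1
    """,
    conn,
)
print(kpi.tail())
conn.close()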

Expand My Business
Remote only
5 - 10 yrs
₹15L - ₹25L / yr
Amazon Web Services (AWS)
Microservices
Data modeling
PostgreSQL
MySQL
+13 more

 

Roles and Responsibilities:

 

● Perform detailed feature requirements analysis along with a team of Senior Developers, define system functionality, work on system design and document the same

● Design/develop/improve Cogno AI's backend infrastructure and stack, and build fault-tolerant, scalable and real-time distributed systems

● Own the design, development and deployment of code to improve product and platform functionality

● Take initiative and propose ideas for improving the technology team's processes, leading to better team performance and more robust solutions

● Write high-performance, reliable and maintainable code

● Support the team with timely analysis and debugging of operational issues

● Emphasis on automation and scripting

● Cross-functional communication to deliver projects

● Mentor junior team members technically and manage a team of software engineers

● Take interviews and create tests for hiring people in the technology team

 

What do we look for?

 

The following are the important eligibility requirements for this Job:

● Bachelor's or Master's degree in computer science or equivalent.

● 5+ years of experience working as a software engineer, preferably in a product-based company.

● Experience working with major cloud solutions AWS (preferred), Azure, and GCP.

● Familiarity with 3-Tier, microservices architecture and distributed systems

● Experience with the design & development of RESTful services (see the sketch after this list)

● Experience with developing Linux-based applications, networking and scripting.

● Experience with different data stores, data modelling and scaling them

● Familiarity with data stores such as PostgreSQL, MySQL, Mongo-DB etc.

● 4+ years of experience with web frameworks (preferably Django, Flask etc.)

● Good understanding of data structures, multi-threading and concurrency concepts.

● Experience with DevOps tools like Jenkins, Ansible, Kubernetes, and Git is a plus.

● Familiarity with Elasticsearch queries and visualization tools like Grafana and Kibana

● Strong networking fundamentals: Firewalls, Proxies, DNS, Load Balancing, etc.

● Strong analytical and problem-solving skills.

● Excellent written and verbal communication skills.

● Team player, flexible and able to work in a fast-paced environment.

● End-to-end ownership of the product. You own what you develop.
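
A minimal RESTful-service sketch in Flask, one of the frameworks the listing prefers; the resource model is an illustrative assumption:

from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for PostgreSQL/MySQL/MongoDB.
TASKS = {1: {"id": 1, "title": "write design doc", "done": False}}

@app.get("/tasks/<int:task_id>")
def get_task(task_id):
    task = TASKS.get(task_id)
    return (jsonify(task), 200) if task else (jsonify(error="not found"), 404)

@app.post("/tasks")
def create_task():
    body = request.get_json(force=True)
    task_id = max(TASKS, default=0) + 1
    TASKS[task_id] = {"id": task_id, "title": body["title"], "done": False}
    return jsonify(TASKS[task_id]), 201

if __name__ == "__main__":
    app.run(debug=True)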

xoxoday

at xoxoday

2 recruiters
Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
8 - 12 yrs
₹45L - ₹65L / yr
JavaScript
SQL
NoSQL Databases
NodeJS (Node.js)
React Native
+8 more

What is the role?

Expected to manage the product plan, engineering, and delivery of Xoxoday Plum. Plum is a rewards and incentives infrastructure for businesses. It's a unified, integrated suite of products to handle various rewarding use cases for consumers, sales, channel partners, and employees. 31% of the total tech team is aligned towards this product and comprises 32 members within Plum Tech, Quality, Design, and Product Management. The annual FY 2019-20 revenue for Plum was $40MN and it is showing high growth potential this year as well. The product has a good mix of both domestic and international clientele and is expanding. The role will be based out of our head office in Bangalore, Karnataka; however, we are open to discussing the option of remote working with 25-50% travel.

Key Responsibilities

  • Scope and lead technology with the right product and business metrics.
  • Directly contribute to product development by writing code if required.
  • Architect systems for scale and stability.
  • Serve as a role model for our high engineering standards and bring consistency to the many codebases and processes you will encounter.
  • Collaborate with stakeholders across disciplines like sales, customers, product, design, and customer success.
  • Code reviews and feedback.
  • Build simple solutions and designs over complex ones, and have a good intuition for what is lasting and scalable.
  • Define a process for maintaining a healthy engineering culture ( Cadence for one-on-ones, meeting structures, HLDs, Best Practices In development, etc).

What are we looking for?

  • Manage a senior tech team of more than 5 direct and 25 indirect developers.
  • Should have experience in handling e-commerce applications at scale.
  • Should have at least 7+ years of experience in software development, agile processes for international e-commerce businesses.
  • Should be extremely hands-on, full-stack developer with modern architecture.
  • Should exhibit skills to build a good engineering team and culture.
  • Should be able to handle the chaos with product planning, prioritizing, customer-first approach.
  • Technical proficiency
  • JavaScript, SQL, NoSQL, PHP
  • Frameworks like React, React Native, Node.js, GraphQL
  • Database technologies like Elasticsearch, Redis, MySQL, Cassandra, MongoDB, Kafka
  • DevOps to manage and architect infra - AWS, CI/CD (Jenkins)
  • System architecture w.r.t. microservices, cloud development, DB administration, data modeling
  • Understanding of security principles and possible attacks, and how to mitigate them.

Whom will you work with?

You will lead the Plum Engineering team and work in close conjunction with the Tech leads of Plum with some cross-functional stake with other products. You'll report to the co-founder directly.

What can you look for?

A wholesome opportunity in a fast-paced environment with scale, international flavour, backend, and frontend. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.

We are

A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore, and Dublin. We have three products in our portfolio: Plum, Empuls, and Compass. Xoxoday works with over 1000 global clients. We help our clients in engaging and motivating their employees, sales teams, channel partners, or consumers for better business results.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

Pune
3 - 5 yrs
₹20L - ₹30L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
CI/CD
+12 more

As an engineer, you will help with the implementation, and launch of many key product features. You will get an opportunity to work on a wide range of technologies (including Spring, AWS Elastic Search, Lambda, ECS, Redis, Spark, Kafka etc.) and apply new technologies for solving problems. You will have an influence on defining product features, drive operational excellence, and spearhead the best practices that enable a quality product. You will get to work with skilled and motivated engineers who are already contributing to building high-scale and high-available systems.

If you are looking for an opportunity to work on leading technologies, would like to build product technology that can cater to millions of customers while providing them the best experience, and relish large ownership and diverse technologies, join our team today!

 

What You'll Do:

  • Creating detailed design, working on development and performing code reviews.
  • Implementing validation and support activities in line with architecture requirements
  • Help the team translate the business requirements into R&D tasks and manage the roadmap of the R&D tasks.
  • Designing, building, and implementation of the product; participating in requirements elicitation, validation of architecture, creation and review of high and low level design, assigning and reviewing tasks for product implementation.
  • Work closely with product managers, UX designers and end users and integrating software components into a fully functional system
  • Ownership of product/feature end-to-end for all phases from the development to the production.
  • Ensuring the developed features are scalable and highly available with no quality concerns.
  • Work closely with senior engineers for refining the design and implementation.
  • Management and execution against project plans and delivery commitments.
  • Assist directly and indirectly in the continual hiring and development of technical talent.
  • Create and execute appropriate quality plans, project plans, test strategies and processes for development activities in concert with business and project management efforts.

The ideal candidate is a passionate engineer about delivering experiences that delight customers and creating solutions that are robust. He/she should be able to commit and own the deliveries end-to-end.

 

 

What You'll Need:

 

  • A Bachelor's degree in Computer Science or related technical discipline.
  • 2-3+ years of software development experience with proficiency in Java or equivalent object-oriented languages, coupled with design and SOA
  • Fluency with Java and Spring is desirable.
  • Experience in JEE applications and frameworks like Struts, Spring, MyBatis, Maven, Gradle
  • Strong knowledge of Data Structures, Algorithms and CS fundamentals.
  • Experience in at least one shell scripting language, SQL, SQL Server, PostgreSQL and data modeling skills
  • Excellent analytical and reasoning skills
  • Ability to learn new domains and deliver output
  • Hands on Experience with the core AWS services
  • Experience working with CI/CD tools (Jenkins, Spinnaker, Nexus, GitLab, TeamCity, GoCD, etc.)

 

  • Expertise in at least one of the following:

    - Kafka, ZeroMQ, AWS SNS/SQS, or equivalent streaming technology (see the sketch after this list)

    - Distributed cache/in memory data grids like Redis, Hazelcast, Ignite, or Memcached

    - Distributed column store databases like Snowflake, Cassandra, or HBase

    - Spark, Flink, Beam, or equivalent streaming data processing frameworks

  • Proficiency in writing and reviewing Python and other object-oriented language(s) is a plus
  • Experience building automations and CICD pipelines (integration, testing, deployment)
  • Experience with Kubernetes would be a plus.
  • Good understanding of working with distributed teams using Agile: Scrum, Kanban
  • Strong interpersonal skills as well as excellent written and verbal communication skills

  • Attention to detail and quality, and the ability to work well in and across teams
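
A minimal streaming sketch with the kafka-python package (Kafka being one of the acceptable options listed above); the broker address and topic are illustrative assumptions:

import json
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("orders", {"order_id": 42, "amount": 199.0})
producer.flush()

consumer = KafkaConsumer(
    "orders",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # {'order_id': 42, 'amount': 199.0}
    break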

goqii Technologies

at goqii Technologies

1 video
2 recruiters
Ambalika Handoo
Posted by Ambalika Handoo
Mumbai
1 - 5 yrs
₹3L - ₹12L / yr
Android Development
Kotlin
Java
SQLite
Data modeling
+2 more
Job Description
Are you bored of writing banking apps, making people click on more ads, or re-skinning or making clones of Temple Run?
Did you ever think you could use your skills to change the world?
If you consider yourself more of an artist who paints on the technology canvas, we want you!!!
GOQii is your chance to work with an amazing team at GOQii who are driven by passion and are here to disrupt the wearable technology & fitness space.
Roles and Responsibilities:
• Relevant experience in native app development.
• Solid understanding of the full mobile development life cycle.
• UI development with the latest frameworks and techniques.
• Understanding of asynchronous client/server interfacing.
• Solid grip on SQLite and data modelling.
• Experience with 3rd-party libraries & APIs - social, payment, network, crash, analytics, etc.
• Experience in managing app performance and memory using various tools.
• Focus on building high-performance, stable and maintainable code.
• Good logical and analytical skills.
• Experience with Git/SVN version control software.
• Thorough understanding of OOP concepts.
• Proficient with Java and Kotlin.
• Clear understanding of the Android SDK, Android Studio, APIs, DBs and Material Design.
• Realm and Room databases.
• Understanding of design patterns.
• Background tasks and threading concepts.
Ttec Digital
Hyderabad
1 - 12 yrs
₹6L - ₹25L / yr
Software Development
.NET
.NET Framework
XML
SOAP
+5 more

Position Description:

TTEC Digital is looking for enthusiastic Developers for Genesys Contact Center products and custom-developed cloud solutions. As a Developer, you will function as an active member of the Development team in the Design, Build, Deploy, and Accept phases of a project's lifecycle, developing web and Windows services, APIs, and applications that integrate with our customers' back-end CRM systems, databases, and external 3rd-party APIs.

Responsibilities:

  • Works with customers as needed to translate design requirements into application solutions, ensuring the requirements are met according to the team’s and practice area’s standards and best practices.
  • Communicates with project manager/client to identify application requirements.
  • Ensures applications meet the standards and requirements of both the client and project manager.
  • Conducts tests of the application for functionality, reliability and stabilization.
  • Deploys/implements the application to the client.
  • Maintains and supports existing applications by fixing problems, addressing issues and determining the need for enhancements.
  • Demonstrates concern for meeting client needs in a manner that provides satisfaction and excellent results for the client, leading to additional opportunities within the client account.
  • Performs all tasks within the budget and on time while meeting all necessary project requirements. Communicates regularly if budget and/or scope changes.
  • Demonstrate professionalism and leadership in representing the Company to customers and vendors.
  • Core PureConnect handler development & maintenance.
  • Monitor and respond to system errors. Participate in on-call rotation.
  • Follow-up on and resolve outstanding issues in a timely manner.
  • Update customer to reflect changes in system configuration as needed.
  • Understand system hardware/software to be able to identify problems and provide a remedy.
  • Handle TAC/Engineering escalations as directed by the team lead or team manager.

Requirements

  • Bachelor’s degree in computer science, business, or related area.
  • 3+ years of relevant experience and proven ability as a software developer.
  • Experience with the Microsoft development platform.
  • Experience with .NET Framework.
  • Professional experience with integration services including XML, SOAP, REST, TCP/IP, JavaScript, and HTML.
  • Deep Understanding of application architecture.
  • Familiarity in data modeling and architecture.
  • Deep expertise and familiarity with the Pure Cloud development platform.

We offer an outstanding career development opportunity, a competitive salary along with full comprehensive benefits.  We are looking for individuals with a team player attitude, strong drive for career growth and a passion for excellence in client support, delivery, and satisfaction.

DASCASE
Remote only
12 - 20 yrs
₹24L - ₹40L / yr
API management
Windows Azure
Spring Boot
Microservices
Cloud Computing
+4 more

API Lead Developer

 

Job Overview:

As an API developer for a very large client, you will be filling the role of a hands-on Azure API Developer. We are looking for someone who has the necessary technical expertise to build and maintain sustainable API solutions that support identified needs and expectations from the client.

 

Delivery Responsibilities

  • Implement an API architecture using Azure API Management, including security, API Gateway, Analytics, and API Services
  • Design reusable assets, components, standards, frameworks, and processes to support and facilitate API and integration projects
  • Conduct functional, regression, and load testing on API’s
  • Gather requirements and defining the strategy for application integration
  • Develop using the following types of Integration protocols/principles: SOAP and Web services stack, REST APIs, RESTful, RPC/RFC
  • Analyze, design, and coordinate the development of major components of the APIs including hands on implementation, testing, review, build automation, and documentation
  • Work with DevOps team to package release components to deploy into higher environment

Required Qualifications

  • Expert Hands-on experience in the following:
    • Technologies such as Spring Boot, Microservices, API Management & Gateway, Event Streaming, Cloud-Native Patterns, Observability & Performance optimizations
    • Data modelling, Master and Operational Data Stores, Data ingestion & distribution patterns, ETL / ELT technologies, Relational and Non-Relational DB's, DB Optimization patterns
  • At least 5 years of experience with Azure APIM
  • At least 8 years' experience in Azure SaaS and PaaS
  • At least 8 years' experience in API Management, including technologies such as Mulesoft and Apigee
  • At least the last 5 years in consulting, with the latest implementations on Azure SaaS services
  • At least 5 years in MS SQL / MySQL development, including data modeling, concurrency, stored procedure development and tuning
  • Excellent communication skills with a demonstrated ability to engage, influence, and encourage partners and stakeholders to drive collaboration and alignment
  • High degree of organization, individual initiative, results and solution oriented, and personal accountability and resiliency
  • Should be a self-starter and team player, capable of working with a team of architects, co-developers, and business analysts

 

Preferred Qualifications:

  • Ability to work as a collaborative team, mentoring and training the junior team members
  • Working knowledge on building and working on/around data integration / engineering / Orchestration
  • Position requires expert knowledge across multiple platforms, integration patterns, processes, data/domain models, and architectures.
  • Candidates must demonstrate an understanding of the following disciplines: enterprise architecture, business architecture, information architecture, application architecture, and integration architecture.
  • Ability to focus on business solutions and understand how to achieve them according to the given timeframes and resources.
  • Recognized as an expert/thought leader. Anticipates and solves highly complex problems with a broad impact on a business area.
  • Experience with Agile Methodology / Scaled Agile Framework (SAFe).
  • Outstanding oral and written communication skills including formal presentations for all levels of management combined with strong collaboration/influencing.

 

Preferred Education/Skills:

  • Prefer Master’s degree
  • Bachelor’s Degree in Computer Science with a minimum of 12+ years relevant experience or equivalent.
Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snow flake schema
Snowflake
SQL
Data modeling
Data engineering
+1 more

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise-scale projects. You will also be responsible for preparing estimates and defining technical solutions to proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have 5+ total yrs. in IT, with 2+ years' experience working as a Snowflake Data Architect and 4+ years in data warehouse, ETL and BI projects.
• Must have at least two end-to-end implementations of a Snowflake cloud data warehouse and three end-to-end on-premise data warehouse implementations, preferably on Oracle.

• Expertise in Snowflake – data modelling, ELT using Snowflake SQL, implementing complex stored Procedures and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, Zero copy clone, time travel and understand how to use these features
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience of Agile development methodologies
• Strong written communication skills. Is effective and persuasive in both written and oral communication

Nice to have Skills/Qualifications: Bachelor's and/or master's degree in computer science or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

Nickelfox

at Nickelfox

1 recruiter
Aakriti Ishwar
Posted by Aakriti Ishwar
Noida
5 - 10 yrs
₹1L - ₹21L / yr
Python
Django
Flask
Data modeling
Design patterns
+2 more
Job Description
  • Lead multiple client projects in the organization. 
  • Define & build technical architecture for projects. 
  • Introduce & ensure right coding standards within the team and ensure that it is maintained in all the projects. 
  • Ensure the quality delivery of projects. 
  • Ensure applications conform to security guidelines wherever required. 
  • Assist pre-sales/sales team for converting raw requirements from potential clients to functional solutions. 
  • Train fellow team members to impose the best practices available. 
  • Work on improving and managing processes within the team. 
  • Implement innovative ideas throughout the team to improve the overall efficiency and quality of the team. 
  • Ensure proper communication & collaboration within the team 


Requirements

7+ Years of experience in developing large scale applications. 
Solid Domain Knowledge and experience with various Design Patterns & Data Modelling (Associations, OOPs Concepts etc.)
Should have exposure to multiple backend technologies and databases - both relational and NoSQL
Should be aware of latest conventions for APIs
Preferred hands-on experience with GraphQL as well as REST APIs (see the sketch after this list)
Must be well-aware of the latest technological advancements for relevant platforms. 
Advanced Concepts of Databases - Views, Stored Procedures, Database Optimization - are a good to have.
Should have Research Oriented Approach 
Solid at Logical thinking and Problem solving 
Solid understanding of Coding Standards, Code Review processes and delivery of quality products 
Experience with various Tools used in Development, Tests & Deployments. 
Sound knowledge of DevOps and CI/CD Pipeline Tools 
Solid experience with Git Workflow on Enterprise projects and larger teams 
Should be good at documentation at the project level and code level; should have good experience with Agile methodology and processes
Should have a good understanding of server-side deployment, scalability, maintainability and handling server security problems.
Should have a good understanding of software UX
Proficient with communication and good at making software architectural judgments
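
A minimal sketch contrasting the GraphQL and REST calls mentioned above, using the requests library; the endpoint and schema are illustrative assumptions (GraphQL-over-HTTP is just a POST carrying the query and variables):

import requests

GRAPHQL_URL = "https://api.example.com/graphql"

query = """
query ($id: ID!) {
  user(id: $id) { name email }
}
"""

resp = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"id": "42"}},
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["data"]["user"])

# The REST equivalent of the same read:
print(requests.get("https://api.example.com/users/42", timeout=10).json())
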
Expected outcomes 

  • Growing the team and retaining talent, thus, creating an inspiring environment for the team members. 
  • Creating more leadership within the team along with mentoring and guiding new joiners and experienced developers. 
  • Creating growth plans for the team and preparing training guides for other team members. 
  • Refining processes in the team on a regular basis to ensure quality delivery of projects- such as coding standards, project collaboration, code review processes etc. 
  • Improving overall efficiency and team productivity by introducing new methodologies and ideas in the team. 
  • Working on R&D and employing innovative technologies in the company. 
  • Streamlining processes which will result in saving time and cost optimization 
  • Ensuring code review healthiness and shipping superior quality code 


 


Benefits

  • Unlimited learning and growth opportunities 
  • A collaborative and cheerful work environment 
  • Exceptional reward and recognition policy  
  • Outstanding compensation  
  • Flexible work hours  
  • Opportunity to make an impact as your work will directly contribute to our business strategy.

At Nickelfox, you have a chance to craft a career path as unique as you are and become the best version of YOU. You will be part of a team with a ‘no limits’ mindset in an inclusive, people-focused culture. And we’re counting on your unique perspective to help Nickelfox grow even faster.  

Are you passionate about tech? Dedicated to learning? Come, join us to build an extraordinary experience for yourself and a dignified working world for all. 
 

What makes Nickelfox a great place for you?

In Nickelfox, you’ll join a team whose passion for technology and understanding of business has driven the company to serve clients across 25+ countries in just five years. We partner with our customers to fuel their growth story and enable them to make the right decisions with our customized technology services and insights. All in all, we are passionate to see our customers win the day. This is the reason why 80% of our business comes from repeat clients.  

Our mission is to provide dignified employment and an environment that recognizes the uniqueness of every individual and values their expertise, and contribution. We have a culture that encourages everyone to bring their authentic selves to work. Our people enjoy a collaborative work environment with exceptional training and career development. If you like working with a curious, youthful, high-performing team, Nickelfox is the place for you.

T500

Agency job
via Talent500 by ANSR by Raghu R
Bengaluru (Bangalore)
3 - 9 yrs
₹10L - ₹30L / yr
Informatica MDM
Data modeling
IDQ

Primary Duties and Responsibilities 

  • Experience with Informatica Multidomain MDM 10.4 tool suite preferred
  • Partnering with data architects and engineers to ensure an optimal data model design and implementation for each MDM domain in accordance with industry and MDM best practices
  • Works with data governance and business steward(s) to design, develop, and configure business rules for data validation, standardize, match, and merge
  • Implementation of Data Quality policies, procedures and standards along with Data Governance Team for maintenance of customer, location, product, and other data domains; Experience with Informatica IDQ tool suite preferred.
  • Performs data analysis and source-to-target mapping for ingest and egress of data.
  • Maintain compliance with change control, SDLC, and development standards.
  • Champion the creation and contribution to technical documentation and diagrams.
  • Establishes a technical vision and strategy with the team and works with the team to turn it into reality.
  • Emphasis on coaching and training to cultivate skill development of team members within the department.
  • Responsible for keeping up with industry best practices and trends.
  • Monitor, troubleshoot, maintain, and continuously improve the MDM ecosystem.

Secondary Duties and Responsibilities

  • May participate in off-hours on-call rotation.
  • Attends and is prepared to participate in team, department and company meetings.
  • Performs other job related duties and special projects as assigned.

Supervisory Responsibilities

This is a non-management role

Education and Experience

  • Bachelor's degree in MIS, Computer Sciences, Business Administration, or related field; or High School Degree/General Education Diploma and 4 years of relevant experience in lieu of Bachelor's degree.
  • 5+ years of experience in implementing MDM solutions using Informatica MDM.
  • 2+ years of experience in data stewardship, data governance, and data management concepts.
  • Professional working knowledge of Customer 360 solution
  • Professional working knowledge in multi domain MDM data modeling.
  • Strong understanding of company master data sets and their application in complex business processes and support data profiling, extraction, and cleansing activities using Informatica Data Quality (IDQ).
  • Strong knowledge in the installation and configuration of the Informatica MDM Hub.
  • Familiarity with real-time, near real-time and batch data integration.
  • Strong experience and understanding of Informatica toolsets including Informatica MDM Hub, Informatica Data Quality (IDQ), Informatica Customer 360, Informatica EDC, Hierarchy Manager (HM), Business Entity Service Model, Address Doctor, Customizations & Composite Services
  • Experience with event-driven architectures (e.g. Kafka, Google Pub/Sub, Azure Event Hub, etc.).
  • Professional working knowledge of CI/CD technologies such as Concourse, TeamCity, Octopus, Jenkins, and CircleCI.
  • Team player that exhibits high energy, strategic thinking, collaboration, direct communication and results orientation.

Physical Requirements

  • Visual requirements include: ability to see detail at near range with or without correction. Must be physically able to perform sedentary work: occasionally lifting or carrying objects of no more than 10 pounds, and occasionally standing or walking, reaching, handling, grasping, feeling, talking, hearing and repetitive motions.

Working Conditions

  • The duties of this position are performed through a combination of an open office setting and remote work options. Full remote work options available for employees that reside outside of the Des Moines Metro Area. There is frequent pressure to meet deadlines and handle multiple projects in a day.

Equipment Used to Perform Job

  • Windows, or Mac computer and various software solutions.

Financial Responsibility

  • Responsible for company assets including maintenance of software solutions.

Contacts

  • Has frequent contact with office personnel in other departments related to the position, as well as occasional contact with users and customers. Engages stakeholders from other area in the business.

Confidentiality

  • Has access to confidential information including trade secrets, intellectual property, various financials, and customer data.
Encubate Tech Private Ltd

Agency job
via staff hire solutions by Purvaja Patidar
Mumbai
5 - 6 yrs
₹15L - ₹20L / yr
Amazon Web Services (AWS)
Amazon Redshift
Data modeling
ETL
Agile/Scrum
+7 more

Roles and Responsibilities

Seeking an AWS Cloud Engineer / Data Warehouse Developer for our Data CoE team to help us configure and develop new AWS environments for our Enterprise Data Lake and migrate on-premise traditional workloads to the cloud. Must have a sound understanding of BI best practices, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques.

• Extensive experience in providing AWS Cloud solutions to various business use cases.

• Creating star schema data models, performing ETLs and validating results with business representatives (see the sketch at the end of this listing).

• Supporting implemented BI solutions by monitoring and tuning queries and data loads, addressing user questions concerning data integrity, monitoring performance and communicating functional and technical issues.

Job Description:

This position is responsible for the successful delivery of business intelligence information to the entire organization and is experienced in BI development and implementations, data architecture and data warehousing.

Requisite Qualification

Essential: AWS Certified Database Specialty or AWS Certified Data Analytics

Preferred: Any other Data Engineer certification

Requisite Experience

Essential: 4-7 yrs of experience

Preferred: 2+ yrs of experience in ETL & data pipelines

Skills Required

• AWS: S3, DMS, Redshift, EC2, VPC, Lambda, Delta Lake, CloudWatch, etc.

• Big data: Databricks, Spark, Glue and Athena

• Expertise in Lake Formation, Python programming, Spark and shell scripting

• Minimum Bachelor's degree with 5+ years of experience in designing, building, and maintaining AWS data components

• 3+ years of experience in data component configuration, related roles and access setup

• Expertise in Python programming

• Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.)

• Comfortable working with DevOps tools: Jenkins, Bitbucket, CI/CD

• Hands-on ETL development experience, preferably using SSIS

• SQL Server experience required

• Strong analytical skills to solve and model complex business requirements

• Sound understanding of BI best practices/methodologies, relational structures, dimensional data modelling, structured query language (SQL) skills, and data warehouse and reporting techniques

Preferred Skills

• Experience working in a SCRUM environment.

• Experience in administration (Windows/Unix/Network/Database/Hadoop) is a plus.

• Experience in SQL Server, SSIS, SSAS, SSRS.

• Comfortable with creating data models and visualizations using Power BI.

• Hands-on experience in relational and multi-dimensional data modelling, including multiple source systems from databases and flat files, and the use of standard data modelling tools.

• Ability to collaborate on a team with infrastructure, BI report development and business analyst resources, and clearly communicate solutions to both technical and non-technical team members.
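
A toy star-schema transformation in pandas to make the star-schema bullet above concrete: split a flat extract into a dimension with surrogate keys and a fact table. The data and column names are illustrative assumptions:

import pandas as pd

# Toy source extract; in practice this would arrive via DMS into S3 staging.
orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer": ["ACME", "Globex", "ACME"],
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-06", "2024-02-01"]),
    "amount": [120.0, 80.0, 200.0],
})

# Dimension: one surrogate key per customer.
dim_customer = (
    orders[["customer"]].drop_duplicates().reset_index(drop=True)
    .rename_axis("customer_key").reset_index()
)

# Fact: measures plus the foreign key into the dimension.
fact_sales = orders.merge(dim_customer, on="customer")[
    ["order_id", "customer_key", "order_date", "amount"]
]

print(dim_customer)
print(fact_sales)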

Thoughtworks

at Thoughtworks

1 video
27 recruiters
Vidyashree Kulkarni
Posted by Vidyashree Kulkarni
Remote only
9 - 15 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+4 more
Data Engineers develop modern data architecture approaches to meet key business objectives and provide end-to-end data solutions. You might spend a few weeks with a new client on a deep technical review or a complete organizational review, helping them to understand the potential that data brings to solve their most pressing problems. On other projects, you might be acting as the architect, leading the design of technical solutions or perhaps overseeing a program inception to build a new product. It could also be a software delivery project where you're equally happy coding and tech-leading the team to implement the solution.

Job responsibilities
  • You will partner with teammates to create complex data processing pipelines in order to solve our clients' most complex challenges
  • You will collaborate with Data Scientists in order to design scalable implementations of their models
  • You will pair to write clean and iterative code based on TDD (see the sketch after this list)
  • Leverage various continuous delivery practices to deploy, support and operate data pipelines
  • Advise and educate clients on how to use different distributed storage and computing technologies from the plethora of options available
  • Develop and operate modern data architecture approaches to meet key business objectives and provide end-to-end data solutions
  • Create data models and speak to the tradeoffs of different modeling approaches
  • Seamlessly incorporate data quality into your day-to-day work as well as into the delivery process
  • Assure effective collaboration between Thoughtworks' and the client's teams, encouraging open communication and advocating for shared outcomes
Job qualifications

Technical skills

  • You have a good understanding of data modelling and experience with data engineering tools and platforms such as Kafka, Spark, and Hadoop
  • You have built large-scale data pipelines and data-centric applications using any of the distributed storage platforms such as HDFS, S3, NoSQL databases (HBase, Cassandra, etc.) and any of the distributed processing platforms like Hadoop, Spark, Hive, Oozie, and Airflow in a production setting (a minimal pipeline sketch follows this list)
  • Hands-on experience with Hadoop distributions such as MapR, Cloudera, Hortonworks and/or cloud-based ones (AWS EMR, Azure HDInsight, Qubole, etc.)
  • You are comfortable taking data-driven approaches and applying data security strategy to solve business problems
  • Working with data excites you: you can build and operate data pipelines, and maintain data storage, all within distributed systems
  • You're genuinely excited about data infrastructure and operations, and are familiar with working in cloud environments

Professional skills
  • You're resilient and flexible in ambiguous situations and enjoy solving problems from technical and business perspectives
  • An interest in coaching, sharing your experience and knowledge with teammates
  • You enjoy influencing others and always advocate for technical excellence while being open to change when needed
  • Presence in the external tech community: you willingly share your expertise with others via speaking engagements, contributions to open source, blogs and more
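
Purely illustrative and not part of Thoughtworks' listing: a minimal PySpark batch pipeline of the kind described above, with a simple data-quality gate folded in. The bucket paths and column names are hypothetical.

```python
# Minimal PySpark batch pipeline sketch: ingest, validate, transform, write.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

# Ingest raw events from distributed storage (S3/HDFS path is illustrative).
raw = spark.read.json("s3a://example-bucket/raw/orders/2024-01-01/")

# Data-quality gate: fail fast if required fields are missing.
null_ids = raw.filter(F.col("order_id").isNull()).count()
if null_ids > 0:
    raise ValueError(f"{null_ids} rows missing order_id; aborting load")

# Transform: daily revenue per customer.
daily = (
    raw.withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date", "customer_id")
       .agg(F.sum("amount").alias("revenue"))
)

# Write back partitioned by date for downstream consumers.
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/daily_revenue/"
)
```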
Read more
xoxoday

at xoxoday

2 recruiters
Agency job
via Jobdost by Mamatha A
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹15L / yr
MySQL
skill iconMongoDB
Data modeling
API
Apache Kafka
+2 more

What is the role?

You will be responsible for building and maintaining highly scalable data infrastructure for our cloud-hosted SAAS product. You will work closely with the Product Managers and Technical team to define and implement data pipelines for customer-facing and internal reports.

Key Responsibilities

  • Design and develop resilient data pipelines.
  • Write efficient queries to fetch data from the report database.
  • Work closely with application backend engineers on data requirements for their stories.
  • Design and develop report APIs for the front end to consume (a minimal sketch follows this list).
  • Focus on building highly available, fault-tolerant report systems.
  • Constantly improve the architecture of the application by clearing the technical backlog.
  • Adopt a culture of learning and development to constantly keep pace with and adopt new technologies.
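
For illustration only (not part of the listing): a minimal report-API sketch using FastAPI, with SQLite standing in for the report database. The endpoint, table, and column names are hypothetical.

```python
# Minimal report API sketch: one endpoint serving pre-aggregated report data.
import sqlite3
from fastapi import FastAPI

app = FastAPI()

def query_report_db(sql: str, params: tuple):
    # SQLite stands in for the real report store (MySQL/ClickHouse/etc.).
    with sqlite3.connect("reports.db") as conn:
        conn.row_factory = sqlite3.Row
        return [dict(r) for r in conn.execute(sql, params)]

@app.get("/reports/redemptions")
def redemptions(start: str, end: str):
    # Fetch daily redemption counts for the requested window.
    rows = query_report_db(
        "SELECT day, COUNT(*) AS redemptions "
        "FROM redemption_events WHERE day BETWEEN ? AND ? GROUP BY day",
        (start, end),
    )
    return {"data": rows}
```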

What are we looking for?

An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.

  • Education: BE/MCA or equivalent
  • Overall 8+ years of experience
  • Expert-level understanding of database concepts and BI
  • Well-versed in databases such as MySQL and MongoDB, with hands-on experience in creating data models
  • Must have designed and implemented low-latency data warehouse systems
  • Must have a strong understanding of Kafka and related systems
  • Experience with the ClickHouse database preferred
  • Must have good knowledge of APIs and should be able to build interfaces for frontend engineers
  • Should be innovative and communicative in approach
  • Will be responsible for the functional/technical track of a project

Whom will you work with?

You will work with a top-notch tech team, working closely with the CTO and product team.  

What can you look for?

A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts, yet maintain the quality of content, interact, and share your ideas and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.

We are

Xoxoday is a rapidly growing fintech SaaS firm that propels business growth while focusing on human motivation. Backed by Giift and Apis Partners Growth Fund II, Xoxoday offers a suite of three products - Plum, Empuls, and Compass. Xoxoday works with more than 2000 clients across 10+ countries and over 2.5 million users. Headquartered in Bengaluru, Xoxoday is a 300+ strong team with four global offices in San Francisco, Dublin, Singapore, New Delhi.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

 

Read more
xpressbees
Alfiya Khan
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile
XpressBees, a logistics company started in 2015, is among the fastest-growing companies in its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and (RT, NRT, batch) reporting & dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision-making, insights, anomaly detection, and prediction.

What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding, and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness, and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Produce logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design, and performance optimization techniques required for the problem (a brief illustrative sketch follows this list).
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
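
Illustrative only (not part of the listing): a small sketch of the physical-design side of the role, creating a date-partitioned table in Spark SQL so time-bounded reads prune to just the partitions they need. The table and column names are hypothetical.

```python
# Sketch: physical design with partition pruning in Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("shipment_model").getOrCreate()

# Physical model: partition the fact table by event date so that
# time-bounded queries scan only the relevant partitions.
spark.sql("""
    CREATE TABLE IF NOT EXISTS shipment_events (
        shipment_id STRING,
        hub_id      STRING,
        status      STRING,
        event_ts    TIMESTAMP,
        event_date  DATE
    )
    USING parquet
    PARTITIONED BY (event_date)
""")

# A query with a partition filter reads only one day's files.
spark.sql("""
    SELECT hub_id, COUNT(*) AS events
    FROM shipment_events
    WHERE event_date = DATE'2024-01-01'
    GROUP BY hub_id
""").show()
```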

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing, and micro-batching to make technology & design choices.
• Strong experience in system integration, application development, ETL, and data-platform projects. Talented across technologies used in the enterprise space.
• Software development experience, including:
• Expertise in relational and dimensional modelling
• Exposure across the full SDLC process
• Experience in cloud architecture (AWS)
• Proven track record in keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
Read more
Wellness Forever Medicare Private Limited
Mumbai
3 - 5 yrs
₹7L - ₹11L / yr
Data Warehouse (DWH)
Informatica
ETL
SQL server
Microsoft Windows Azure
+4 more
  • Minimum 3-4 years of experience with ETL tools, SQL, SSAS & SSIS
  • Good understanding of Data Governance, including Master Data Management (MDM) and Data Quality tools and processes
  • Knowledge of languages and formats such as JSON, Python, R
  • Hands-on experience with SQL database design
  • Experience working with REST APIs (a minimal sketch follows this list)
  • Influencing and supporting project delivery through involvement in project/sprint planning and QA
  • Working experience with Azure
  • Stakeholder management
  • Good communication skills
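
Illustrative only: a small sketch combining the REST API and SQL skills above, pulling JSON from a hypothetical endpoint into a staging table. SQLite stands in for SQL Server/Azure SQL, and the endpoint and schema are assumptions.

```python
# Sketch: ingest JSON from a REST endpoint into a staging table.
import sqlite3
import requests

# Hypothetical endpoint; in practice this would be a real upstream API.
resp = requests.get("https://api.example.com/v1/stores", timeout=30)
resp.raise_for_status()
stores = resp.json()

conn = sqlite3.connect("staging.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS stg_stores (
        store_id TEXT PRIMARY KEY,
        name     TEXT,
        city     TEXT
    )
""")
conn.executemany(
    "INSERT OR REPLACE INTO stg_stores (store_id, name, city) VALUES (?, ?, ?)",
    [(s["id"], s["name"], s["city"]) for s in stores],
)
conn.commit()
```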
Read more
Crayon Data

at Crayon Data

2 recruiters
Varnisha Sethupathi
Posted by Varnisha Sethupathi
Chennai
5 - 8 yrs
₹15L - ₹25L / yr
SQL
skill iconPython
Analytical Skills
Data modeling
Data Visualization
+1 more

Role: Senior Customer Scientist

Experience: 6-8 Years

Location: Chennai (Hybrid)

Who are we?

A young, fast-growing AI and big data company, with an ambitious vision to simplify the world's choices. Our clients are top-tier enterprises in the banking, e-commerce and travel spaces. They use our core AI-based choice engine, maya.ai, to deliver personal digital experiences centered around taste. The maya.ai platform now touches over 125M customers globally. You'll find Crayon Boxes in Chennai and Singapore. But you'll find Crayons in every corner of the world. Especially where our client projects are – UAE, India, SE Asia and pretty soon the US.
 
 

Life in the Crayon Box is a little chaotic, largely dynamic and keeps us on our toes! Crayons are a diverse and passionate bunch. Challenges excite us. Our mission drives us. And good food, caffeine (for the most part) and youthful energy fuel us. Over the last year alone, Crayon has seen a growth rate of 3x, and we believe this is just the start. 
 

 
We’re looking for young and young-at-heart professionals with a relentless drive to help Crayon double its growth. Leaders, doers, innovators, dreamers, implementers and eccentric visionaries, we have a place for you all. 
 

 
 

Can you say “Yes, I have!” to the below? 
 
 

  1. Experience with exploratory analysis, statistical analysis, and model development
  2. Knowledge of advanced analytics techniques, including predictive modelling (logistic regression), segmentation, forecasting, data mining, and optimization (a brief illustrative sketch follows this list)
  3. Knowledge of software packages such as SAS, R, and RapidMiner for analytical modelling and data management
  4. Strong experience in SQL/Python/R, working efficiently at scale with large data sets
  5. Experience in using Business Intelligence tools such as Power BI, Tableau, and Metabase for business applications
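
For illustration (not part of the listing): a minimal scikit-learn sketch of the predictive-modelling item above, fitting a logistic-regression propensity model on synthetic data. The feature meanings are hypothetical.

```python
# Sketch: logistic regression for a binary propensity model on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 4))  # e.g. spend, recency, frequency, tenure
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# Evaluate discrimination on held-out data.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")
```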

 
 

 

Can you say “Yes, I will!” to the below? 
 
 

  1. Drive clarity and solve ambiguous, challenging business problems using data-driven approaches. Propose and own data analysis (including modelling, coding, analytics) to drive business insight and facilitate decisions.
  2. Develop creative solutions and build prototypes for business problems using algorithms based on machine learning, statistics, and optimisation, and work with engineering to deploy those algorithms and create impact in production.
  3. Perform time-series analyses, hypothesis testing, and causal analyses to statistically assess relative impact and extract trends.
  4. Coordinate individual teams to fulfil client requirements and manage deliverables.
  5. Communicate and present complex concepts to business audiences.
  6. Travel to client locations when necessary.

 

 

Crayon is an equal opportunity employer. Employment is based on a person's merit, qualifications, and professional competence. Crayon does not discriminate against any employee or applicant because of race, creed, color, religion, gender, sexual orientation, gender identity/expression, national origin, disability, age, genetic information, marital status, pregnancy, or related conditions.
 
 

More about Crayon: https://www.crayondata.com/

More about maya.ai: https://maya.ai/

 

 

Read more
Wissen Technology

at Wissen Technology

4 recruiters
Lokesh Manikappa
Posted by Lokesh Manikappa
Bengaluru (Bangalore)
5 - 12 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data modeling
Spark
+5 more

Job Description

The applicant must have a minimum of 5 years of hands-on IT experience, working across the full software lifecycle in Agile mode.

Experience in data modeling and/or systems architecture is good to have. Responsibilities will include technical analysis, design, development, and performing enhancements.

You will participate in all or most of the following activities:
- Working with business analysts and other project leads to understand requirements.
- Modeling and implementing database schemas in DB2 UDB or other relational databases.
- Designing, developing, and maintaining data processing using Python, DB2, Greenplum, Autosys, and other technologies.

 

Skills/Expertise Required:

Work experience in developing large-volume databases (DB2/Greenplum/Oracle/Sybase).

Good experience in writing stored procedures, integrating database processing, and tuning and optimizing database queries.

Strong knowledge of table partitions, high-performance loading, and data processing (a minimal loading sketch follows this section).
Good to have hands-on experience working with Perl or Python.
Hands-on development using the Spark/KDB/Greenplum platforms will be a strong plus.
Designing, developing, maintaining, and supporting Data Extract, Transform and Load (ETL) software using Informatica, shell scripts, DB2 UDB, and Autosys.
Coming up with system architecture/re-design proposals for greater efficiency and ease of maintenance, and developing software to turn proposals into implementations.

You will need to work with business analysts and other project leads to understand requirements.
Strong collaboration and communication skills.
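
Illustrative only: a small Python sketch of the high-performance loading theme above, batching inserts instead of writing row by row. SQLite stands in for DB2/Greenplum, and all names are hypothetical.

```python
# Sketch: high-volume load in batches instead of row-at-a-time inserts.
import sqlite3

BATCH = 10_000

def load_trades(conn, rows):
    """Insert rows in large batches; one commit per batch cuts round trips."""
    buf = []
    for row in rows:
        buf.append(row)
        if len(buf) >= BATCH:
            conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", buf)
            conn.commit()
            buf.clear()
    if buf:
        conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", buf)
        conn.commit()

conn = sqlite3.connect("marketdata.db")
conn.execute("CREATE TABLE IF NOT EXISTS trades (trade_id TEXT, symbol TEXT, qty REAL)")
load_trades(conn, [(f"t{i}", "XYZ", 100.0) for i in range(25_000)])
```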

Read more
A fast-growing SaaS commerce company permanent WFH & Office

A fast-growing SaaS commerce company permanent WFH & Office

Agency job
via Jobdost by Mamatha A
Remote only
10 - 15 yrs
₹40L - ₹60L / yr
Engineering Management
skill iconNodeJS (Node.js)
skill iconReact Native
GraphQL
skill iconElastic Search
+9 more

Job Description

 

What is the role?

Expected to manage the product plan, engineering, and delivery of Plum Integration activities. Plum is a rewards and incentives infrastructure for businesses. It's a unified, integrated suite of products to handle various rewarding use cases for consumers, sales, channel partners, and employees. 31% of the total tech team is aligned towards this product and comprises 32 members within Plum Tech, Quality, Design, and Product Management. The annual FY 2019-20 revenue for Plum was $40MN and it is showing high growth potential this year as well. The product has a good mix of both domestic and international clientele and is expanding. The role will be based out of our head office in Bangalore, Karnataka; however, we are open to discussing the option of remote working with 25-50% travel.

Key Responsibilities

  • Scope and lead technology with the right product and business metrics.
  • Directly contribute to product development by writing code if required.
  • Architect systems for scale and stability.
  • Serve as a role model for our high engineering standards and bring consistency to the many codebases and processes you will encounter.
  • Collaborate with stakeholders across disciplines like sales, customers, product, design, and customer success.
  • Code reviews and feedback.
  • Build simple solutions and designs over complex ones and have a good intuition for what is lasting and scalable.
  • Define a process for maintaining a healthy engineering culture (Cadence for one-on-ones, meeting structures, HLDs, Best Practices In development, etc.).

What are we looking for?

  • Manage a senior tech team of more than 5 direct and 10 indirect developers.
  • Should have experience in handling e-commerce applications at scale.
  • Should have experience working with applications like HubSpot, Salesforce, and other CRMs.
  • Should have experience in B2B integrations.
  • Should have at least 10+ years of experience in software development and agile processes for international e-commerce businesses.
  • Should be extremely hands-on: a full-stack developer with an "automate as much as possible" mindset.
  • Should exhibit the skills to build a good engineering team and culture.
  • Should be able to handle the chaos of product planning and prioritizing with a customer-first approach.
  • Technical proficiency:
  • Frameworks like React, React Native, Node.js, GraphQL
  • Database technologies like Elasticsearch, Redis, MySQL, MongoDB, Kafka
  • DevOps to manage and architect infra: AWS, CI/CD (Jenkins)
  • System architecture w.r.t. microservices, cloud development, DB administration, data modeling
  • Understanding of security principles, possible attacks, and how to mitigate them.

Whom will you work with?

You will lead the Plum Integration Engineering team and work in close conjunction with the tech leads of Plum, with some cross-functional stake in other products. You will report directly to the CTO.

What can you look for?

A wholesome opportunity in a fast-paced environment with scale, international flavor, backend, and frontend. Work with a team of highly talented young professionals and enjoy the benefits.
Read more
EASEBUZZ

at EASEBUZZ

1 recruiter
Amala Baby
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions (fintech) company which enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focusing on building plug-n-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

• Experience with ETL, data modeling, and data architecture.

• Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services in combination with third parties: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

• Experience with an AWS cloud data lake for development of real-time or near-real-time use cases.

• Experience with messaging systems such as Kafka/Kinesis for real-time data ingestion and processing (a minimal sketch follows this list).

• Build data pipeline frameworks to automate high-volume and real-time data delivery.

• Create prototypes and proofs of concept for iterative development.

• Experience with NoSQL databases, such as DynamoDB, MongoDB, etc.

• Create and maintain optimal data pipeline architecture.

• Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS 'big data' technologies.

• Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

• Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.

• Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

• Create data tools for the analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

• Evangelize a very high standard of quality, reliability, and performance for data models and algorithms that can be streamlined into the engineering and science workflows.

• Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
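
For illustration only (not part of the listing): a minimal boto3 sketch of real-time ingestion into Kinesis, one of the streaming options named above. The stream name, region, and event shape are hypothetical.

```python
# Sketch: push payment events into a Kinesis stream for real-time pipelines.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="ap-south-1")

def publish_event(event: dict) -> None:
    """Send one event; partitioning by merchant keeps each merchant's events ordered."""
    kinesis.put_record(
        StreamName="payment-events",            # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event["merchant_id"],
    )

publish_event({"merchant_id": "m-123", "amount": 499.0, "status": "captured"})
```

Partitioning by merchant_id preserves per-merchant ordering, at the cost of hot shards if one merchant dominates traffic.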

 

Employment Type

Full-time

 

Read more
A Pre-series A funded FinTech Company

A Pre-series A funded FinTech Company

Agency job
via GoHyre by Avik Majumder
Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own data integrity across distributed systems.
  • Extract, transform, and load data from multiple systems into the BI platform for reporting.
  • Create data sets and data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the project manager to deliver on business requirements OTIF (on time, in full).
  • Understand cross-functional business data points thoroughly and be the SPOC for all data-related queries.
  • Work with both web analytics and backend data analytics.
  • Support the rest of the BI team in generating reports and analysis.
  • Quickly learn and use bespoke and third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for leadership.

Requirements:

  • Strong experience in data warehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports, and dashboards based on business requirements
  • Knowledge and experience in prototyping, design, and requirements analysis
  • Ability to implement row-level security on data and an understanding of application security layer models in Power BI
  • Proficiency in writing DAX queries in Power BI Desktop
  • Expertise in using advanced-level calculations on data sets
  • Experience in the fintech domain and stakeholder management
Read more
Digital B2B Platform

Digital B2B Platform

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
6 - 12 yrs
₹60L - ₹80L / yr
skill iconJava
J2EE
skill iconSpring Boot
Hibernate (Java)
Problem solving
+3 more

Looking only for candidates from tier-1 colleges OR with experience in a product-based company.

Desired Skills:

● Experience with data modeling and SQL/NoSQL databases (a brief illustrative sketch follows this list)

● Experience with distributed systems and microservices

● Good experience in working with any of Java/SpringBoot, GoLang or NodeJS

● Excellent problem solving and debugging skills

● Passionate about the experience of software engineering as much as the output

● A strong sense of ownership

● Ability to communicate your ideas and approach to solving problems with clarity
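
Illustrative only (the role itself centres on Java/Go/Node): a compact Python/SQLAlchemy sketch of the relational data-modelling skill the listing asks for, with a one-to-many relationship. The entities are hypothetical.

```python
# Sketch: a small relational model with a one-to-many relationship.
from sqlalchemy import Column, ForeignKey, Integer, String, create_engine
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Buyer(Base):
    __tablename__ = "buyers"
    id = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    orders = relationship("Order", back_populates="buyer")

class Order(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True)
    buyer_id = Column(Integer, ForeignKey("buyers.id"), nullable=False)
    status = Column(String, nullable=False, default="created")
    buyer = relationship("Buyer", back_populates="orders")

engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```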

Read more
Digital B2B Platform

Digital B2B Platform

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
3 - 5 yrs
₹30L - ₹45L / yr
Data modeling
skill iconJava
Spring
Microservices
SQL

Looking only for candidates from tier-1 colleges OR with experience in a product-based company.

Desired Skills:

● Experience with data modeling and SQL/NoSQL databases

● Experience with distributed systems and microservices

● Good experience in working with any of Java/SpringBoot, GoLang or NodeJS

● Excellent problem solving and debugging skills

● Passionate about the experience of software engineering as much as the output

● A strong sense of ownership

● Ability to communicate your ideas and approach to solving problems with clarity

Read more