Data modeling Jobs in Bangalore (Bengaluru)

30+ Data modeling Jobs in Bangalore (Bengaluru) | Data modeling Job openings in Bangalore (Bengaluru)

Apply to 30+ Data modeling Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest Data modeling Job opportunities across top companies like Google, Amazon & Adobe.

Cognitive Clouds Software Pvt Ltd

Posted by Talent Acquisition
Bengaluru (Bangalore)
4 - 6 yrs
Best in industry
Snowflake schema
ETL
Data modeling

We are seeking a skilled Data Engineer with strong proficiency in SQL and extensive experience in data modeling. The ideal candidate will be adept at designing and implementing robust data architectures, including snowflake schemas and ER diagrams, with fact (transaction) and dimension tables tied together by surrogate, primary, and foreign keys. A minimal schema sketch follows the requirements below.


  • Over 4 years of experience as a data engineer or in a similar role.
  • Technical expertise with data models, data mining, and segmentation techniques.
  • Knowledge of programming languages (e.g. Java and Python).
  • Hands-on experience with SQL database design.
  • Develop and maintain efficient SQL queries for data extraction, transformation, and loading (ETL) processes.
  • Design and implement data models, including snowflake schemas and ER diagrams, to support business requirements.
  • Collaborate with cross-functional teams to understand data needs and requirements, and translate them into scalable database solutions.
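
The schema concepts above translate directly into DDL. Below is a minimal, hypothetical sketch of a snowflake schema, run against SQLite purely for illustration: a fact (transaction) table keyed by a surrogate key, referencing a dimension that is itself normalized one level further. All table and column names are invented for the example.

```python
# A minimal snowflake-schema sketch in plain SQL, executed against SQLite
# for illustration only; all names here are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only with this pragma

cur.executescript("""
CREATE TABLE dim_category (
    category_sk   INTEGER PRIMARY KEY,   -- surrogate key
    category_name TEXT NOT NULL
);
-- Snowflaking: the product dimension is normalized one level further.
CREATE TABLE dim_product (
    product_sk         INTEGER PRIMARY KEY,   -- surrogate key
    product_natural_id TEXT NOT NULL,         -- business/natural key
    category_sk        INTEGER NOT NULL REFERENCES dim_category(category_sk)
);
-- Transaction (fact) table: foreign keys point at dimension surrogate keys.
CREATE TABLE fact_sales (
    sales_sk   INTEGER PRIMARY KEY,
    product_sk INTEGER NOT NULL REFERENCES dim_product(product_sk),
    sold_at    TEXT NOT NULL,
    amount     REAL NOT NULL
);
""")
conn.commit()
```
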
Arting Digital
Pragati Bhardwaj
Posted by Pragati Bhardwaj
Bengaluru (Bangalore)
10 - 16 yrs
₹10L - ₹15L / yr
databricks
Data modeling
SQL
Python
AWS Lambda
+2 more

Title: Lead Data Engineer

Experience: 10+ years

Budget: 32-36 LPA

Location: Bangalore

Mode of Work: Work from office

Primary Skills: Databricks, Spark, PySpark, SQL, Python, AWS

Qualification: Any engineering degree


Roles and Responsibilities:

• 8-10+ years' experience in developing scalable Big Data applications or solutions on distributed platforms.
• Able to partner with others in solving complex problems by taking a broad perspective to identify innovative solutions.
• Strong skills building positive relationships across Product and Engineering.
• Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
• Able to quickly pick up new programming languages, technologies, and frameworks.
• Experience working in Agile and Scrum development processes.
• Experience working in a fast-paced, results-oriented environment.
• Experience in Amazon Web Services (AWS), mainly S3, Managed Airflow, EMR/EC2, IAM, etc.
• Experience working with data warehousing tools, including SQL databases, Presto, and Snowflake.
• Experience architecting data products in streaming, serverless, and microservices architectures and platforms.
• Experience working with data platforms, including EMR, Airflow, and Databricks (Data Engineering & Delta Lake components, and Lakehouse medallion architecture), etc. (A sketch of one such pipeline follows this list.)
• Experience creating/configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
• Experience working with distributed technology tools, including Spark, Python, and Scala.
• Working knowledge of data warehousing, data modelling, governance, and data architecture.
• Working knowledge of reporting and analytical tools such as Tableau, QuickSight, etc.
• Demonstrated experience in learning new technologies and skills.
• Bachelor's degree in Computer Science, Information Systems, Business, or another relevant subject area.
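
As referenced in the list above, here is a minimal sketch of a bronze-to-silver Delta Lake hop on S3 in PySpark. It assumes a Spark session with the Delta Lake and S3 connector packages configured; the bucket, paths, and column names are hypothetical.

```python
# A minimal medallion-style (bronze -> silver) sketch with PySpark and Delta
# Lake on S3; assumes delta-spark and hadoop-aws are configured.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw events landed on S3 as JSON, read as-is.
bronze = spark.read.json("s3a://example-bucket/bronze/events/")

# Silver: de-duplicated and stamped with an ingest date.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("ingest_date", F.current_date())
)

# Persist as a Delta table for downstream gold-layer consumers.
silver.write.format("delta").mode("overwrite").save(
    "s3a://example-bucket/silver/events/"
)
```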

hopscotch
Bengaluru (Bangalore)
5 - 8 yrs
₹6L - ₹15L / yr
Python
Amazon Redshift
Amazon Web Services (AWS)
PySpark
Data engineering
+3 more

About the role:

Hopscotch is looking for a passionate Data Engineer to join our team. You will work closely with other teams like data analytics, marketing, data science and individual product teams to specify, validate, prototype, scale, and deploy data pipeline features and data architecture.


Here’s what will be expected out of you:

➢ Ability to work in a fast-paced startup mindset. Should be able to manage all aspects of data extraction, transfer, and load activities.

➢ Develop data pipelines that make data available across platforms.

➢ Should be comfortable in executing ETL (Extract, Transform and Load) processes which include data ingestion, data cleaning and curation into a data warehouse, database, or data platform.

➢ Work on various aspects of the AI/ML ecosystem – data modeling, data and ML pipelines.

➢ Work closely with Devops and senior Architect to come up with scalable system and model architectures for enabling real-time and batch services.


What we want:

➢ 5+ years of experience as a data engineer or data scientist with a focus on data engineering and ETL jobs.

➢ Well versed with the concepts of data warehousing, data modelling and/or data analysis.

➢ Experience building pipelines and performing ETL with industry-standard best practices on Redshift (2+ years).

➢ Ability to troubleshoot and solve performance issues with data ingestion, data processing & query execution on Redshift.

➢ Good understanding of orchestration tools like Airflow (a minimal DAG sketch follows this list).

➢ Strong Python and SQL coding skills.

➢ Strong experience in distributed systems like Spark.

➢ Experience with AWS data and ML technologies (AWS Glue, MWAA, Data Pipeline, EMR, Athena, Redshift, Lambda, etc.).

➢ Solid hands-on experience with various data extraction techniques like CDC or time/batch based and the related tools (Debezium, AWS DMS, Kafka Connect, etc.) for near-real-time and batch data extraction.
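
A minimal Airflow DAG sketch for the orchestration pattern mentioned above, assuming Airflow 2.4+ (where `schedule` replaced `schedule_interval`); the task bodies, IDs, and schedule are hypothetical placeholders.

```python
# A minimal daily ETL DAG sketch; the task bodies are placeholders for a
# real extract (e.g. CDC via Debezium/DMS) and a Redshift COPY load.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # e.g. pull incremental rows via CDC or a batch export to S3

def load_to_redshift():
    ...  # e.g. COPY staged S3 files into a Redshift table

with DAG(
    dag_id="daily_redshift_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load_to_redshift)
    extract_task >> load_task  # load runs only after extract succeeds
```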


Note:

Experience at product-based or e-commerce companies is an added advantage.

Bengaluru (Bangalore)
3 - 10 yrs
₹30L - ₹50L / yr
Data engineering
Data modeling
Python

Requirements:

  • 2+ years of experience (4+ for Senior Data Engineer) with system/data integration, development, or implementation of enterprise and/or cloud software. Engineering degree in Computer Science, Engineering or a related field.
  • Extensive hands-on experience with data integration/EAI technologies (File, API, Queues, Streams), ETL tools, and building custom data pipelines (a minimal pipeline sketch follows this list).
  • Demonstrated proficiency with Python, JavaScript and/or Java.
  • Familiarity with version control/SCM is a must (experience with Git is a plus).
  • Experience with relational and NoSQL databases (any vendor). Solid understanding of cloud computing concepts.
  • Strong organisational and troubleshooting skills with attention to detail.
  • Strong analytical ability, judgment, and problem-solving techniques. Interpersonal and communication skills with the ability to work effectively in a cross-functional team.
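
As a rough illustration of the custom data pipeline path named above (API source to relational target), here is a minimal sketch using requests and SQLite; the endpoint URL and schema are hypothetical.

```python
# A minimal extract -> transform -> load sketch: pull JSON rows from a
# (hypothetical) API and upsert them into a local SQLite table.
import sqlite3

import requests

def extract(url: str) -> list[dict]:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

def transform(rows: list[dict]) -> list[tuple]:
    # Keep only rows with an id; normalize the name field.
    return [(r["id"], r["name"].strip().lower()) for r in rows if r.get("id")]

def load(rows: list[tuple]) -> None:
    with sqlite3.connect("warehouse.db") as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"
        )
        conn.executemany("INSERT OR REPLACE INTO users VALUES (?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract("https://api.example.com/users")))
```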


Molecular Connections

Posted by Molecular Connections
Bengaluru (Bangalore)
8 - 10 yrs
₹15L - ₹20L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+4 more
  1. Big data developer with 8+ years of professional IT experience, with expertise in Hadoop ecosystem components in ingestion, data modeling, querying, processing, storage, analysis, data integration, and implementing enterprise-level systems spanning Big Data.
  2. A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
  3. Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, Zookeeper, YARN, Flume, Pig, Nifi, Scala and Oozie.
  4. Hands-on experience in creating real-time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm (a minimal streaming sketch follows this list).
  5. Excellent knowledge of Hadoop architecture and daemons of Hadoop clusters, which include Name Node, Data Node, Resource Manager, Node Manager and Job History Server.
  6. Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
  7. Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
  8. Hands on experience in coding MapReduce/Yarn Programs using Java, Scala and Python for analyzing Big Data.
  9. Exposure to Cloudera development environment and management using Cloudera Manager.
  10. Extensively worked on Spark using Scala on clusters for analytics; installed it on top of Hadoop and built advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
  11. Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce before loading the data into HDFS.
  12. Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
  13. Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
  14. Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
  15. Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
  16. Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
  17. Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
  18. In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
  19. Hands on expertise in real time analytics with Apache Spark.
  20. Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
  21. Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
  22. Experience in Microsoft cloud and setting up clusters in Amazon EC2 & S3, including the automation of setting up and extending the clusters in the AWS Amazon cloud.
  23. Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
  24. Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
  25. Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
  26. Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
  27. Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
  28. In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
  29. Establishing multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provide the access for pulling the information we need for analysis. 
  30. Generated various kinds of knowledge reports using Power BI based on Business specification. 
  31. Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
  32. Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
  33. Experienced in designing, building, deploying, and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
  34. Good experience with use-case development, with Software methodologies like Agile and Waterfall.
  35. Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
  36. Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle, and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
  37. Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
  38. Proficient in NoSQL databases including HBase, Cassandra, MongoDB and its integration with Hadoop cluster.
  39. Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
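
Relating to item 4 above, here is a minimal Structured Streaming sketch of the Spark + Kafka pattern; the broker address, topic, and paths are hypothetical, and the spark-sql-kafka package must be on the classpath.

```python
# A minimal Spark Structured Streaming sketch: read a Kafka topic and sink
# it to Parquet with checkpointing. Broker, topic, and paths are invented.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
    # Kafka values arrive as bytes; cast to string before parsing.
    .select(F.col("value").cast("string").alias("raw"))
)

# A real job would parse the JSON payload and aggregate; this just lands it.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/clickstream/")
    .option("checkpointLocation", "/chk/clickstream/")
    .start()
)
query.awaitTermination()
```
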
Mobile Programming LLC

Sukhdeep Singh
Posted by Sukhdeep Singh
Bengaluru (Bangalore)
4 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
Snowflake schema
Snowflake
+5 more

Job Title: AWS-Azure Data Engineer with Snowflake

Location: Bangalore, India

Experience: 4+ years

Budget: 15 to 20 LPA

Notice Period: Immediate joiners or less than 15 days

Job Description:

We are seeking an experienced AWS-Azure Data Engineer with expertise in Snowflake to join our team in Bangalore. As a Data Engineer, you will be responsible for designing, implementing, and maintaining data infrastructure and systems using AWS, Azure, and Snowflake. Your primary focus will be on developing scalable and efficient data pipelines, optimizing data storage and processing, and ensuring the availability and reliability of data for analysis and reporting.

Responsibilities:

  1. Design, develop, and maintain data pipelines on AWS and Azure to ingest, process, and transform data from various sources.
  2. Optimize data storage and processing using cloud-native services and technologies such as AWS S3, AWS Glue, Azure Data Lake Storage, Azure Data Factory, etc.
  3. Implement and manage data warehouse solutions using Snowflake, including schema design, query optimization, and performance tuning (a minimal load sketch follows this list).
  4. Collaborate with cross-functional teams to understand data requirements and translate them into scalable and efficient technical solutions.
  5. Ensure data quality and integrity by implementing data validation, cleansing, and transformation processes.
  6. Develop and maintain ETL processes for data integration and migration between different data sources and platforms.
  7. Implement and enforce data governance and security practices, including access control, encryption, and compliance with regulations.
  8. Collaborate with data scientists and analysts to support their data needs and enable advanced analytics and machine learning initiatives.
  9. Monitor and troubleshoot data pipelines and systems to identify and resolve performance issues or data inconsistencies.
  10. Stay updated with the latest advancements in cloud technologies, data engineering best practices, and emerging trends in the industry.
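
A minimal sketch of one such load step using the snowflake-connector-python package: copying staged files into a Snowflake table. Credentials, stage, and table names are hypothetical.

```python
# A minimal Snowflake bulk-load sketch via the Python connector; the
# account, warehouse, stage, and table names are all placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="USER", password="PASSWORD", account="my_account",
    warehouse="LOAD_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# COPY INTO pulls staged files (e.g. from an external S3 stage) into a table.
cur.execute(
    "COPY INTO raw.orders FROM @raw.s3_stage/orders/ "
    "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
)

cur.close()
conn.close()
```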

Requirements:

  1. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  2. Minimum of 4 years of experience as a Data Engineer, with a focus on AWS, Azure, and Snowflake.
  3. Strong proficiency in data modelling, ETL development, and data integration.
  4. Expertise in cloud platforms such as AWS and Azure, including hands-on experience with data storage and processing services.
  5. In-depth knowledge of Snowflake, including schema design, SQL optimization, and performance tuning.
  6. Experience with scripting languages such as Python or Java for data manipulation and automation tasks.
  7. Familiarity with data governance principles and security best practices.
  8. Strong problem-solving skills and ability to work independently in a fast-paced environment.
  9. Excellent communication and interpersonal skills to collaborate effectively with cross-functional teams and stakeholders.
  10. Immediate joiner or notice period less than 15 days preferred.

If you possess the required skills and are passionate about leveraging AWS, Azure, and Snowflake to build scalable data solutions, we invite you to apply. Please submit your resume and a cover letter highlighting your relevant experience and achievements in the AWS, Azure, and Snowflake domains.

Digisprint
Agency job
via Aanet Talent Global Solutions by Anita Ravishankar
Bengaluru (Bangalore)
5 - 10 yrs
₹20L - ₹30L / yr
Java
IBM WebSphere
IBM WebSphere Commerce
WCS
Solr
+9 more

About the role

For this role, the candidate needs to be responsible for working with a team to develop web platforms, solutions, applications, and interfaces. Create and maintain websites including e-commerce platforms and custom applications. Provide support for the optimization effort of applications, with emphasis on improvements to application performance and page loading times.


Responsibilities

• Design and develop solutions within a multi-functional Agile team to support business priorities for the HCL Commerce application (formerly IBM WebSphere Commerce).
• Collaborate with UX designers, business solutions, product owners, and data services teams to design and discover functional, architectural, and quality requirements.
• Educate and promote the use of best practices, while designing principles and frameworks to resolve issues.
• Share domain and technical expertise, providing technical mentorship and cross-training to peers and team members.
• Conduct design and code reviews of developed solutions and unit tests for optimization.
• Participate in the deployment process and in architectural and design discussions with teams.
• Perform support for scheduled after-hours tasks related to maintenance, production, and non-production release deployments.


Candidate requirements

1. Bachelor's Degree, or equivalent experience, in Engineering, Computer Science or a related field.
2. Experience in WCS v7/v8/v9, data model, contract, programming model, Java, J2EE, Oracle DB/DB2, Linux/shell scripting, and jQuery.
3. Hands-on experience in WCS 9.x, 8.x, 7.x versions.
4. Strong design and implementation experience in WebSphere Commerce 9.x, 8.x, 7.x.
5. Development and customization experience in SOLR (a query sketch follows this list).
6. Development and integration with SOAP and REST services.
7. Working experience on any monitoring/observability tools, CI/CD pipelines like Jenkins, Solr and performance improvements.
8. Strong understanding of WebSphere Commerce application architecture.
9. Experience in integrating, designing and developing custom solutions within WebSphere Commerce.
10. Hands-on experience with utilizing various sub-systems of WebSphere Commerce Server (Catalog, Order, Member, Payment).
11. Experience working on different business models of commerce (B2B, B2C, Extended Sites).
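
Relating to the SOLR customization point (item 5), a hedged sketch of querying a Solr core over its standard HTTP select handler, shown in Python for brevity; the host and core names are hypothetical.

```python
# A minimal Solr query sketch over the HTTP /select handler; the host,
# port, core name, and fields are placeholders for a real deployment.
import requests

resp = requests.get(
    "http://solr-host:8983/solr/products/select",
    params={"q": "name:laptop", "rows": 10, "wt": "json"},
    timeout=10,
)
resp.raise_for_status()

# Solr wraps matching documents under response.docs.
for doc in resp.json()["response"]["docs"]:
    print(doc.get("id"), doc.get("name"))
```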


About DigiSprint Solutions:

We are a global retail e-commerce start-up with two decades of digital transformation industry experience. We have successfully built world-class e-commerce transformation solutions for leading retailers in the US, UK, Mexico & South Africa.

Our expertise includes domain consulting, technology consulting, enterprise architecture, leading platform solutions, custom headless microservices and end-to-end testing across systems. Our primary focus is to work with our customers in a partnership mindset. We work with trending digital commerce technologies such as Oracle CX Cloud, Adobe Commerce Cloud, Spring Boot, Microservices, Java, etc.

Porter.in

Agency job
via UPhill HR by Ingit Pandey
Bengaluru (Bangalore)
4 - 8 yrs
₹15L - ₹28L / yr
Python
SQL
Data Visualization
Data modeling
Predictive modelling
+1 more

Responsibilities

This role requires a person to support business charters & accompanying products by aligning with the Analytics Manager's vision, understanding tactical requirements and helping in successful execution. The split would be approx. 70% management + 30% individual contributor. Responsibilities include:


Project Management

- Understand business needs and objectives.

- Refine use cases and plan iterations and deliverables - able to pivot as required.

- Estimate efforts and conduct regular task updates to ensure timeline adherence.

- Set and manage stakeholder expectations as required


Quality Execution

- Help BA and SBA resources with requirement gathering and final presentations.

- Resolve blockers regarding technical challenges and decision-making.

- Check final deliverables for correctness and review codes, along with Manager.


KPIs and metrics

- Orchestrate metrics building, maintenance, and performance monitoring.

- Owns and manages data models, data sources, and data definition repo.

- Makes low-level design choices during execution.


Team Nurturing

- Help Analytics Manager during regular one-on-ones + check-ins + recruitment.

- Provide technical guidance whenever required.

- Improve benchmarking and decision-making skills at execution-level.

- Train and get new resources up-to-speed.

- Knowledge building (methodologies) to better position the team for complex problems.


Communication

- Upstream to document and discuss execution challenges, process inefficiencies, and feedback loops.

- Downstream and parallel for context-building, mentoring, stakeholder management.


Analytics Stack (a minimal Python + SQL sketch follows this list)

- Analytics : Python / R + SQL + Excel / PPT, Colab notebooks

- Database : PostgreSQL, Amazon Redshift, DynamoDB, Aerospike

- Warehouse : Amazon Redshift

- ETL : Lots of Python + custom-made

- Business Intelligence / Visualization : Metabase + Python/R libraries (location data)

- Deployment pipeline : Docker, Git, Jenkins, AWS Lambda
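
As referenced above, here is a minimal sketch of the Python + SQL layer of this stack: pulling a daily metric from PostgreSQL/Redshift into pandas. The DSN and table name are hypothetical.

```python
# A minimal analytics pull: run a SQL aggregate on PostgreSQL/Redshift and
# land it in a pandas DataFrame. Connection details and table are invented.
import pandas as pd
import psycopg2

conn = psycopg2.connect(
    "host=redshift-host dbname=analytics user=analyst password=secret"
)

# pandas accepts a DB-API connection here (it warns but works).
df = pd.read_sql(
    """
    SELECT order_date::date AS day, COUNT(*) AS orders
    FROM fact_orders
    GROUP BY 1
    ORDER BY 1
    """,
    conn,
)
print(df.tail())
conn.close()
```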

xoxoday

Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
8 - 12 yrs
₹45L - ₹65L / yr
JavaScript
SQL
NoSQL Databases
Node.js
React Native
+8 more

What is the role?

Expected to manage the product plan, engineering, and delivery of Xoxoday Plum. Plum is a rewards and incentives infrastructure for businesses. It's a unified, integrated suite of products to handle various rewarding use cases for consumers, sales, channel partners, and employees. 31% of the total tech team is aligned towards this product and comprises 32 members across Plum Tech, Quality, Design, and Product Management. The annual FY 2019-20 revenue for Plum was $40MN and it is showing high growth potential this year as well. The product has a good mix of both domestic and international clientele and is expanding. The role will be based out of our head office in Bangalore, Karnataka; however, we are open to discussing the option of remote working with 25-50% travel.

Key Responsibilities

  • Scope and lead technology with the right product and business metrics.
  • Directly contribute to product development by writing code if required.
  • Architect systems for scale and stability.
  • Serve as a role model for our high engineering standards and bring consistency to the many codebases and processes you will encounter.
  • Collaborate with stakeholders across disciplines like sales, customers, product, design, and customer success.
  • Code reviews and feedback.
  • Build simple solutions and designs over complex ones, and have a good intuition for what is lasting and scalable.
  • Define a process for maintaining a healthy engineering culture (cadence for one-on-ones, meeting structures, HLDs, best practices in development, etc.).

What are we looking for?

  • Manage a senior tech team of more than 5 direct and 25 indirect developers.
  • Should have experience in handling e-commerce applications at scale.
  • Should have 7+ years of experience in software development and agile processes for international e-commerce businesses.
  • Should be extremely hands-on, full-stack developer with modern architecture.
  • Should exhibit skills to build a good engineering team and culture.
  • Should be able to handle the chaos with product planning, prioritizing, customer-first approach.
  • Technical proficiency
  • JavaScript, SQL, NoSQL, PHP
  • Frameworks like React, ReactNative, Node.js, GraphQL
  • Database technologies like ElasticSearch, Redis, MySQL, Cassandra, MongoDB, Kafka
  • DevOps to manage and architect infra - AWS, CI/CD (Jenkins)
  • System Architecture w.r.t Microservices, Cloud Development, DB Administration, Data Modeling
  • Understanding of security principles and possible attacks and mitigate them.

Whom will you work with?

You will lead the Plum Engineering team and work in close conjunction with the Tech leads of Plum with some cross-functional stake with other products. You'll report to the co-founder directly.

What can you look for?

A wholesome opportunity in a fast-paced environment with scale, international flavour, backend, and frontend. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.

We are

A fast-growing SaaS commerce company based in Bangalore with offices in Delhi, Mumbai, SF, Dubai, Singapore, and Dublin. We have three products in our portfolio: Plum, Empuls, and Compass. Xoxoday works with over 1000 global clients. We help our clients in engaging and motivating their employees, sales teams, channel partners, or consumers for better business results.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

Tredence
Bengaluru (Bangalore), Pune, Gurugram, Chennai
8 - 12 yrs
₹12L - ₹30L / yr
Snowflake schema
Snowflake
SQL
Data modeling
Data engineering
+1 more

JOB DESCRIPTION: THE IDEAL CANDIDATE WILL:

• Ensure new features and subject areas are modelled to integrate with existing structures and provide a consistent view. Develop and maintain documentation of the data architecture, data flow and data models of the data warehouse appropriate for various audiences. Provide direction on adoption of Cloud technologies (Snowflake) and industry best practices in the field of data warehouse architecture and modelling.

• Providing technical leadership to large enterprise scale projects. You will also be responsible for preparing estimates and defining technical solutions to proposals (RFPs). This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project Roles & Responsibilities.

ELIGIBILITY CRITERIA: Desired Experience/Skills:
• Must have a total of 5+ yrs. in IT, with 2+ years' experience working as a Snowflake Data Architect and 4+ years in Data Warehouse, ETL and BI projects.
• Must have experience with at least two end-to-end implementations of the Snowflake cloud data warehouse and three end-to-end data warehouse implementations on-premises, preferably on Oracle.

• Expertise in Snowflake – data modelling, ELT using Snowflake SQL, implementing complex stored Procedures and standard DWH and ETL concepts
• Expertise in Snowflake advanced concepts like setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy clone and time travel, and understanding how to use these features (a brief sketch of clone and time travel follows this list)
• Expertise in deploying Snowflake features such as data sharing, events and lake-house patterns
• Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
• Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
• Deep understanding of relational as well as NoSQL data stores, methods and approaches (star and snowflake, dimensional modelling)
• Experience with data security and data access controls and design
• Experience with AWS or Azure data storage and management technologies such as S3 and ADLS
• Build processes supporting data transformation, data structures, metadata, dependency and workload management
• Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning and troubleshooting
• Provide resolution to an extensive range of complicated data pipeline related problems, proactively and as issues surface
• Must have expertise in AWS or Azure Platform as a Service (PAAS)
• Certified Snowflake cloud data warehouse Architect (Desirable)
• Should be able to troubleshoot problems across infrastructure, platform and application domains.
• Must have experience of Agile development methodologies
• Strong written communication skills. Is effective and persuasive in both written and oral communication
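
A brief sketch of two of the Snowflake features named above, zero-copy clone and time travel, issued as plain SQL through the Python connector; all object names and credentials are hypothetical.

```python
# Zero-copy clone and time travel in Snowflake, driven from Python; the
# account, credentials, and table names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    user="USER", password="PASSWORD", account="my_account"
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy of an existing table.
cur.execute("CREATE TABLE analytics.sales_backup CLONE analytics.sales")

# Time travel: query the table as it looked one hour (3600 s) ago.
cur.execute("SELECT COUNT(*) FROM analytics.sales AT(OFFSET => -3600)")
print(cur.fetchone())

conn.close()
```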

Nice to have Skills/Qualifications:
• Bachelor's and/or master's degree in computer science or equivalent experience.
• Strong communication, analytical and problem-solving skills with a high attention to detail.

 

About you:
• You are self-motivated, collaborative, eager to learn, and hands on
• You love trying out new apps, and find yourself coming up with ideas to improve them
• You stay ahead with all the latest trends and technologies
• You are particular about following industry best practices and have high standards regarding quality

T500


Agency job
via Talent500 by ANSR by Raghu R
Bengaluru (Bangalore)
3 - 9 yrs
₹10L - ₹30L / yr
Informatica MDM
Data modeling
IDQ

Primary Duties and Responsibilities 

  • Experience with Informatica Multidomain MDM 10.4 tool suite preferred
  • Partnering with data architects and engineers to ensure an optimal data model design and implementation for each MDM domain in accordance with industry and MDM best practices
  • Works with data governance and business steward(s) to design, develop, and configure business rules for data validation, standardization, match, and merge (a toy match-and-merge sketch follows this list)
  • Implementation of Data Quality policies, procedures and standards along with Data Governance Team for maintenance of customer, location, product, and other data domains; Experience with Informatica IDQ tool suite preferred.
  • Performs data analysis and source-to-target mapping for ingest and egress of data.
  • Maintain compliance with change control, SDLC, and development standards.
  • Champion the creation and contribution to technical documentation and diagrams.
  • Establishes a technical vision and strategy with the team and works with the team to turn it into reality.
  • Emphasis on coaching and training to cultivate skill development of team members within the department.
  • Responsible for keeping up with industry best practices and trends.
  • Monitor, troubleshoot, maintain, and continuously improve the MDM ecosystem.
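
The match-and-merge rules above are configured inside Informatica MDM rather than hand-coded; purely to illustrate the underlying idea, here is a toy Python sketch using stdlib difflib similarity as the match criterion and a longest-value survivorship rule. The threshold and records are invented.

```python
# Not Informatica's API: a toy illustration of match-and-merge logic.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Corp", "city": "Bangalore"},
    {"id": 2, "name": "ACME Corporation", "city": ""},
]

def similar(a: str, b: str, threshold: float = 0.7) -> bool:
    # Match rule: case-insensitive string similarity above a threshold.
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def merge(a: dict, b: dict) -> dict:
    # Survivorship rule: per attribute, keep the longer (more complete) value.
    return {
        k: max((a.get(k, ""), b.get(k, "")), key=lambda v: len(str(v)))
        for k in a
    }

if similar(records[0]["name"], records[1]["name"]):
    print(merge(records[0], records[1]))
    # -> {'id': 1, 'name': 'ACME Corporation', 'city': 'Bangalore'}
```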

Secondary Duties and Responsibilities

  • May participate in off-hours on-call rotation.
  • Attends and is prepared to participate in team, department and company meetings.
  • Performs other job related duties and special projects as assigned.

Supervisory Responsibilities

This is a non-management role.

Education and Experience

  • Bachelor's degree in MIS, Computer Sciences, Business Administration, or related field; or High School Degree/General Education Diploma and 4 years of relevant experience in lieu of Bachelor's degree.
  • 5+ years of experience in implementing MDM solutions using Informatica MDM.
  • 2+ years of experience in data stewardship, data governance, and data management concepts.
  • Professional working knowledge of Customer 360 solution
  • Professional working knowledge in multi domain MDM data modeling.
  • Strong understanding of company master data sets and their application in complex business processes and support data profiling, extraction, and cleansing activities using Informatica Data Quality (IDQ).
  • Strong knowledge in the installation and configuration of the Informatica MDM Hub.
  • Familiarity with real-time, near real-time and batch data integration.
  • Strong experience and understanding of Informatica toolsets including Informatica MDM Hub, Informatica Data Quality (IDQ), Informatica Customer 360, Informatica EDC, Hierarchy Manager (HM), Business Entity Service Model, Address Doctor, Customizations & Composite Services
  • Experience with event-driven architectures (e.g. Kafka, Google Pub/Sub, Azure Event Hub, etc.).
  • Professional working knowledge of CI/CD technologies such as Concourse, TeamCity, Octopus, Jenkins, and CircleCI.
  • Team player that exhibits high energy, strategic thinking, collaboration, direct communication and results orientation.

Physical Requirements

  • Visual requirements include: ability to see detail at near range with or without correction. Must be physically able to perform sedentary work: occasionally lifting or carrying objects of no more than 10 pounds, and occasionally standing or walking, reaching, handling, grasping, feeling, talking, hearing and repetitive motions.

Working Conditions

  • The duties of this position are performed through a combination of an open office setting and remote work options. Full remote work options available for employees that reside outside of the Des Moines Metro Area. There is frequent pressure to meet deadlines and handle multiple projects in a day.

Equipment Used to Perform Job

  • Windows, or Mac computer and various software solutions.

Financial Responsibility

  • Responsible for company assets including maintenance of software solutions.

Contacts

  • Has frequent contact with office personnel in other departments related to the position, as well as occasional contact with users and customers. Engages stakeholders from other areas in the business.

Confidentiality

  • Has access to confidential information including trade secrets, intellectual property, various financials, and customer data.
xoxoday

Agency job
via Jobdost by Mamatha A
Bengaluru (Bangalore)
7 - 9 yrs
₹15L - ₹15L / yr
MySQL
MongoDB
Data modeling
API
Apache Kafka
+2 more

What is the role?

You will be responsible for building and maintaining highly scalable data infrastructure for our cloud-hosted SaaS product. You will work closely with the Product Managers and the technical team to define and implement data pipelines for customer-facing and internal reports.

Key Responsibilities

  • Design and develop resilient data pipelines.
  • Write efficient queries to fetch data from the report database.
  • Work closely with application backend engineers on data requirements for their stories.
  • Designing and developing report APIs for the front end to consume.
  • Focus on building highly available, fault-tolerant report systems.
  • Constantly improve the architecture of the application by clearing the technical backlog. 
  • Adopt a culture of learning and development to constantly keep pace with and adopt new technologies.

What are we looking for?

An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of it. We are open to promising candidates who are passionate about their work and are team players.

  • Education - BE/MCA or equivalent
  • Overall 8+ years of experience
  • Expert level understanding of database concepts and BI.
  • Well versed in databases such as MySQL and MongoDB, with hands-on experience in creating data models.
  • Must have designed and implemented low-latency data warehouse systems.
  • Must have a strong understanding of Kafka and related systems (a minimal consumer sketch follows this list).
  • Experience with the ClickHouse database preferred.
  • Must have good knowledge of APIs and should be able to build interfaces for frontend engineers.
  • Should be innovative and communicative in approach
  • Will be responsible for the functional/technical track of a project
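
As referenced above, a minimal sketch of the Kafka-to-report-store path using the kafka-python package; the broker, topic, and report schema are hypothetical, and SQLite stands in for the real report database.

```python
# A minimal Kafka consumer feeding a report table; broker, topic, and
# schema are placeholders. A real pipeline would batch and aggregate.
import json
import sqlite3

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

conn = sqlite3.connect("reports.db")
conn.execute("CREATE TABLE IF NOT EXISTS daily_orders (day TEXT, amount REAL)")

for msg in consumer:
    event = msg.value
    conn.execute(
        "INSERT INTO daily_orders VALUES (?, ?)",
        (event["day"], event["amount"]),
    )
    conn.commit()
```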

Whom will you work with?

You will work with a top-notch tech team, working closely with the CTO and product team.  

What can you look for?

A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts, yet maintain the quality of content, interact, and share your ideas and have loads of learning while at work. Work with a team of highly talented young professionals and enjoy the benefits of being at Xoxoday.

We are

Xoxoday is a rapidly growing fintech SaaS firm that propels business growth while focusing on human motivation. Backed by Giift and Apis Partners Growth Fund II, Xoxoday offers a suite of three products - Plum, Empuls, and Compass. Xoxoday works with more than 2000 clients across 10+ countries and over 2.5 million users. Headquartered in Bengaluru, Xoxoday is a 300+ strong team with four global offices in San Francisco, Dublin, Singapore, New Delhi.

Way forward

We look forward to connecting with you. As you may take time to review this opportunity, we will wait for a reasonable time of around 3-5 days before we screen the collected applications and start lining up job discussions with the hiring manager. We however assure you that we will attempt to maintain a reasonable time window for successfully closing this requirement. The candidates will be kept informed and updated on the feedback and application status.

 

xpressbees
Alfiya Khan
Posted by Alfiya Khan
Pune, Bengaluru (Bangalore)
6 - 8 yrs
₹15L - ₹25L / yr
Big Data
Data Warehouse (DWH)
Data modeling
Apache Spark
Data integration
+10 more
Company Profile

XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies of its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last mile management system. While on this aggressive growth path, we seek to become the one-stop-shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.

Job Profile

As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and (RT, NRT, batch) reporting & dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.

What You Will Do

• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet the data completeness, correctness and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) which are optimal physical designs for different use cases (structured/semi-structured). Envision & implement the optimal data modelling, physical design and performance optimization technique/approach required for the problem (a minimal partitioning sketch follows this list).
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
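
A minimal sketch of the physical-design point flagged above: writing a batch pipeline's output partitioned by date so downstream reads can prune partitions. Paths and column names are hypothetical.

```python
# Partitioned physical layout with PySpark: one directory per ship_date,
# so date-filtered queries read only the partitions they need.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("batch-partition-demo").getOrCreate()

shipments = spark.read.parquet("s3a://example-bucket/staging/shipments/")

(
    shipments
    .repartition("ship_date")        # cluster rows by partition value
    .write.mode("overwrite")
    .partitionBy("ship_date")        # physical layout: /ship_date=YYYY-MM-DD/
    .parquet("s3a://example-bucket/curated/shipments/")
)
```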

Qualifications & Experience relevant for the role

• A bachelor's degree in Computer Science or a related field with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching to make technology & design choices.
• Strong experience in System Integration, Application Development, ETL and Data-Platform projects. Talented across technologies used in the enterprise space.
• Software development experience with expertise in relational and dimensional modelling and exposure across the whole SDLC process.
• Experience in cloud architecture (AWS).
• Proven track record in keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
Wissen Technology

Lokesh Manikappa
Posted by Lokesh Manikappa
Bengaluru (Bangalore)
5 - 12 yrs
₹15L - ₹35L / yr
ETL
Informatica
Data Warehouse (DWH)
Data modeling
Spark
+5 more

Job Description

The applicant must have a minimum of 5 years of hands-on IT experience, working on a full software lifecycle in Agile mode.

Good to have experience in data modeling and/or systems architecture.
Responsibilities will include technical analysis, design, development and perform enhancements.

You will participate in all/most of the following activities:
- Working with business analysts and other project leads to understand requirements.
- Modeling and implementing database schemas in DB2 UDB or other relational databases.
- Designing, developing and maintaining data processing using Python, DB2, Greenplum, Autosys and other technologies.

 

Skills/Expertise Required:

Work experience in developing large-volume databases (DB2/Greenplum/Oracle/Sybase).

Good experience in writing stored procedures, integration of database processing, and tuning and optimizing database queries.

Strong knowledge of table partitions, high-performance loading and data processing (a minimal partitioned-load sketch follows this section).
Good to have hands-on experience working with Perl or Python.
Hands-on development using the Spark / KDB / Greenplum platform will be a strong plus.
Designing, developing, maintaining and supporting Data Extract, Transform and Load (ETL) software using Informatica, shell scripts, DB2 UDB and Autosys.
Coming up with system architecture/re-design proposals for greater efficiency and ease of maintenance, and developing software to turn proposals into implementations.

Need to work with business analysts and other project leads to understand requirements.
Strong collaboration and communication skills.
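
As mentioned above, a minimal sketch of a partitioned table plus a high-performance COPY load, using PostgreSQL declarative-partitioning syntax (Greenplum is Postgres-derived, though older Greenplum versions use their own partition clauses) via psycopg2; the DSN and schema are hypothetical.

```python
# Range-partitioned table and a bulk COPY load; connection details,
# table, and file are placeholders.
import psycopg2

conn = psycopg2.connect("host=gp-host dbname=dw user=etl password=secret")
cur = conn.cursor()

cur.execute("""
CREATE TABLE IF NOT EXISTS trades (
    trade_id   BIGINT,
    trade_date DATE NOT NULL,
    notional   NUMERIC
) PARTITION BY RANGE (trade_date);

CREATE TABLE IF NOT EXISTS trades_2024q1 PARTITION OF trades
    FOR VALUES FROM ('2024-01-01') TO ('2024-04-01');
""")

# High-performance load: COPY streams the file straight into the table,
# and rows are routed to the matching partition.
with open("trades.csv") as f:
    cur.copy_expert("COPY trades FROM STDIN WITH (FORMAT csv, HEADER true)", f)

conn.commit()
conn.close()
```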

Bengaluru (Bangalore)
3 - 6 yrs
₹15L - ₹30L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+6 more

Responsibilities:

  • Ensure and own Data integrity across distributed systems.
  • Extract, Transform and Load data from multiple systems for reporting into BI platform.
  • Create Data Sets and Data models to build intelligence upon.
  • Develop and own various integration tools and data points.
  • Hands-on development and/or design within the project in order to maintain timelines.
  • Work closely with the Project manager to deliver on business requirements OTIF (on time in full)
  • Understand the cross-functional business data points thoroughly and be SPOC for all data-related queries.
  • Work with both Web Analytics and Backend Data analytics.
  • Support the rest of the BI team in generating reports and analysis
  • Quickly learn and use bespoke & third-party SaaS reporting tools with little documentation.
  • Assist in presenting demos and preparing materials for Leadership.

Requirements:

  • Strong experience in data warehouse modeling techniques and SQL queries
  • A good understanding of designing, developing, deploying, and maintaining Power BI report solutions
  • Ability to create KPIs, visualizations, reports, and dashboards based on business requirements
  • Knowledge and experience in prototyping, designing, and requirement analysis
  • Be able to implement row-level security on data and understand application security layer models in Power BI
  • Proficiency in making DAX queries in Power BI desktop.
  • Expertise in using advanced level calculations on data sets
  • Experience in the Fintech domain and stakeholder management.
Digital B2B Platform
Agency job
via Jobdost by Sathish Kumar
Bengaluru (Bangalore)
6 - 12 yrs
₹60L - ₹80L / yr
Java
J2EE
Spring Boot
Hibernate (Java)
Problem solving
+3 more

Looking for candidates only from tier 1 colleges OR with experience in a product-based company.

 

Desired Skills :

● Experience with data modeling and SQL/NoSQL databases

● Experience with distributed systems and microservices

● Good experience in working with any of Java/SpringBoot, GoLang or NodeJS

● Excellent problem solving and debugging skills

● Passionate about the experience of software engineering as much as the output

● A strong sense of ownership

● Ability to communicate your ideas and approach to solving problems with clarity

Digital B2B Platform
Bengaluru (Bangalore)
3 - 5 yrs
₹30L - ₹45L / yr
Data modeling
Java
Spring
Microservices
SQL

Looking for candidates only from tier 1 colleges OR with experience in a product-based company.


Desired Skills :

● Experience with data modeling and SQL/NoSQL databases

● Experience with distributed systems and microservices

● Good experience in working with any of Java/SpringBoot, GoLang or NodeJS

● Excellent problem solving and debugging skills

● Passionate about the experience of software engineering as much as the output

● A strong sense of ownership

● Ability to communicate your ideas and approach to solving problems with clarity

Elastic

Nikita Rathi
Posted by Nikita Rathi
Bengaluru (Bangalore)
4 - 8 yrs
₹14L - ₹30L / yr
Payment gateways
Payment processing
Compensation & Benefits
Compensation
HR analytics
+9 more
Elastic is a search company built on a free and open heritage. Anyone can use Elastic products and solutions to get started quickly and frictionlessly. Elastic offers three solutions for enterprise search, observability, and security, built on one technology stack that can be deployed anywhere. From finding documents to monitoring infrastructure to hunting for threats, Elastic makes data usable in real time and at scale. Thousands of organizations worldwide, including Cisco, eBay, Goldman Sachs, Microsoft, The Mayo Clinic, NASA, The New York Times, Wikipedia, and Verizon, use Elastic to power mission-critical systems. Founded in 2012, Elastic is a distributed company with Elasticians around the globe and is publicly traded on the NYSE under the symbol ESTC. Learn more at elastic.co.

We’re looking for a Compensation Advisor to help drive our compensation philosophy and programs to support our overall people strategies.

What You Will Be Doing:
  • Support our Corporate functions with all elements of general compensation including job evaluation & classification within the Elastic job infrastructure
  • Create and present comprehensive analysis to assess compensation issues in the business and provide market competitive and business aligned solutions using internal and external benchmark data sources
  • Partner and provide thought leadership to HR Business Partners, Recruiting, and line management on the administration of all Elastic compensation programs and policies
  • Assist in the planning and management of annual compensation processes and programs (e.g. Annual Compensation Cycle)
  • Design and build compensation metrics, reports, and tools to inform compensation program decisions and forecast, report, and/or analyze compensation business outcomes
  • Partner with other Global Compensation team members and broader HR team in the design, development and maintenance of various compensation policies and programs
  • Evangelize Elastic’s Compensation and Total Rewards Strategy
What You Bring Along:
  • A real passion for fast-paced and dynamic environments, the ability to thrive in ambiguity and devote a diversified global view to all you do
  • Experience with compensation design and administration
  • A dedication to think big, use data to drive strategy, challenge convention, and potentially reinvent how work is done
  • A proficiency in optimizing compensation processes and programs with a keen understanding of the balance between structure and flexibility
  • The dexterity to balance critical, long term thinking about how we scale and optimize our rapidly growing employee base in tandem with the development and execution of global people strategies
And some of the basics:
  • Bachelor's Degree and 5+ years of related experience; or 3 years with a Master’s degree
  • 3+ years of experience in compensation analysis, financial analysis, statistical analysis, and/or data modeling is required
  • Demonstrable mastery with Excel and Google suite
  • Experience working with HR Business Partners or similar in a business facing role to solve compensation challenges.
  • Experience working in a hyper-growth, global organization and exposure to compensation programs at scale
  • Strategic, analytical, critical thinker who isn’t afraid to get into the details; real passion for problem solving
  • Excellent written and verbal communication and presentation skills
  • High degree of integrity and honesty; ability to exercise confidentiality and neutrality in complex and sensitive situations
  • A sense of humor and the ability to roll with the punches is definitely a plus.


Additional Information - We Take Care Of Our People

As a distributed company, diversity drives our identity. Whether you’re looking to launch a new career or grow an existing one, Elastic is the type of company where you can balance great work with great life. Your age is only a number. It doesn’t matter if you’re just out of college or your children are; we need you for what you can do.

We strive to have parity of benefits across regions and while regulations differ from place to place, we believe taking care of our people is the right thing to do.
  • Competitive pay based on the work you do here and not your previous salary
  • Health coverage for you and your family in many locations
  • Ability to craft your calendar with flexible locations and schedules for many roles
  • Generous number of vacation days each year
  • Double your charitable giving - We match up to $1500 (or local currency equivalent)
  • Up to 40 hours each year to use toward volunteer projects you love
  • Embracing parenthood with a minimum of 16 weeks of parental leave
 
 
Bengaluru (Bangalore), UK
5 - 10 yrs
₹15L - ₹25L / yr
Data Visualization
PowerBI
ADF
Business Intelligence (BI)
PySpark
+11 more

Power BI Developer

A senior visualization engineer with 5 years' experience in Power BI, to develop and deliver solutions that enable delivery of information to audiences in support of key business processes. In addition, hands-on experience with Azure data services like ADF and Databricks is a must.

Ensure code and design quality through execution of test plans and assist in development of standards & guidelines working closely with internal and external design, business, and technical counterparts.

Candidates should have worked in agile development environments.

Desired Competencies:

  • Should have a minimum of 3 years' project experience using Power BI on the Azure stack.
  • Should have good understanding and working knowledge of Data Warehouse and Data Modelling.
  • Good hands-on experience of Power BI
  • Hands-on experience T-SQL/ DAX/ MDX/ SSIS
  • Data Warehousing on SQL Server (preferably 2016)
  • Experience in Azure Data Services – ADF, Databricks & PySpark
  • Manage own workload with minimum supervision.
  • Take responsibility of projects or issues assigned to them
  • Be personable, flexible and a team player
  • Good written and verbal communications
  • Have a strong personality who will be able to operate directly with users
Bengaluru (Bangalore)
3 - 5 yrs
₹10L - ₹25L / yr
Vue.js
AngularJS (1.x)
Angular (2+)
React.js
JavaScript
+10 more
Bachelor’s degree required.
Minimum four years of experience.
Good for you to have –

Excellent knowledge of architectural/design patterns, data structures and algorithms
Expertise on performance tuning and optimizations.
You will definitely possess these technical skills –

Core skill set (must) : Core Java, Multi-threading, GC, J2EE technologies, REST
Core skill set (must) : RDBMS, Data Modeling, DB tuning
Working Knowledge (must): Server side implementation for highly concurrent and responsive systems.
Bengaluru (Bangalore)
4 - 10 yrs
₹15L - ₹22L / yr
SQL Azure
ADF
Business process management
Windows Azure
SQL
+12 more

Desired Competencies:

• Expertise in Azure Data Factory V2
• Expertise in other Azure components like Data Lake Store, SQL Database, Databricks
• Must have working knowledge of Spark programming
• Good exposure to data projects dealing with data design and source-to-target documentation, including defining transformation rules
• Strong knowledge of the CI/CD process
• Experience in building Power BI reports
• Understanding of different components like pipelines, activities, datasets & linked services
• Exposure to dynamic configuration of pipelines using datasets and linked services (a minimal run-trigger sketch follows this list)
• Experience in designing, developing and deploying pipelines to higher environments
• Good knowledge of file formats for flexible usage, and file location objects (SFTP, FTP, local, HDFS, ADLS, BLOB, Amazon S3 etc.)
• Strong knowledge of SQL queries
• Must have worked in full life-cycle development from functional design to deployment
• Should have working knowledge of Git, SVN
• Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases etc.
• Should have working knowledge of different resources available in Azure like Storage Account, Synapse, Azure SQL Server, Azure Data Bricks, Azure Purview
• Any experience related to metadata management, data modelling, and related tools (Erwin or ER Studio or others) would be preferred
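
As referenced in the list above, a minimal sketch of triggering and polling an ADF pipeline run with the azure-mgmt-datafactory SDK, passing run-time parameters that feed dynamic dataset/linked-service configuration; the subscription, resource group, factory, and pipeline names are hypothetical.

```python
# Trigger an ADF pipeline run and check its status; all resource names
# below are placeholders for a real factory.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(
    DefaultAzureCredential(), "<subscription-id>"
)

# Parameters flow into the pipeline's datasets and linked services at
# run time, which is how dynamic configuration is typically wired up.
run = client.pipelines.create_run(
    resource_group_name="rg-data",
    factory_name="adf-demo",
    pipeline_name="copy_to_adls",
    parameters={"load_date": "2024-01-01"},
)

status = client.pipeline_runs.get("rg-data", "adf-demo", run.run_id)
print(status.status)  # e.g. InProgress / Succeeded / Failed
```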

 

Preferred Qualifications:

Ø  Bachelor's degree in Computer Science or Technology

Ø  Proven success in contributing to a team-oriented environment

Ø  Proven ability to work creatively and analytically in a problem-solving environment

Ø  Excellent communication (written and oral) and interpersonal skills

Qualifications

BE/BTECH

KEY RESPONSIBILITIES :

You will join a team designing and building a data warehouse covering both relational and dimensional models, developing reports, data marts and other extracts and delivering these via SSIS, SSRS, SSAS, and Power BI. It is seen as playing a vital role in delivering a single version of the truth on the client's data and delivering MI & BI that will feature in enabling both operational and strategic decision making.

You will be able to take responsibility for projects over the entire software lifecycle and work with minimum supervision. This would include technical analysis, design, development, and test support as well as managing the delivery to production.

The initial project being resourced is around the development and implementation of a Data Warehouse and associated MI/BI functions.

 

Principal Activities:

1. Interpret written business requirements documents.
2. Specify (High Level Design and Tech Spec), code and write automated unit tests for new aspects of the MI/BI Service.
3. Write clear and concise supporting documentation for deliverable items.
4. Become a member of the skilled development team, willing to contribute and share experiences and learn as appropriate.
5. Review and contribute to requirements documentation.
6. Provide third line support for internally developed software.
7. Create and maintain continuous deployment pipelines.
8. Help maintain Development Team standards and principles.
9. Contribute and share learning and experiences with the greater Development team.
10. Work within the company's approved processes, including design and service transition.
11. Collaborate with other teams and departments across the firm.
12. Be willing to travel to other offices when required.
13. You agree to comply with any reasonable instructions or regulations issued by the Company from time to time, including those set out in the terms of the dealing and other manuals, including staff handbooks and all other group policies.


Location – Bangalore

 

Mobile Programming India Pvt Ltd

at Mobile Programming India Pvt Ltd

1 video
17 recruiters
Inderjit Kaur
Posted by Inderjit Kaur
Bengaluru (Bangalore), Chennai, Pune, Gurugram
4 - 8 yrs
₹9L - ₹14L / yr
ETL
Data Warehouse (DWH)
Data engineering
Data modeling
BRIEF JOB RESPONSIBILITIES:

• Responsible for designing, deploying, and maintaining an analytics environment that processes data at scale
• Contribute to the design, configuration, deployment, and documentation of components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, enrichment, and loading of data into a variety of cloud data platforms, including AWS and Microsoft Azure (a minimal ingestion sketch follows the competencies list)
• Identify gaps and improve the existing platform's quality, robustness, maintainability, and speed
• Evaluate new and upcoming big data solutions and make recommendations for adoption, extending our platform to meet advanced analytics use cases such as predictive modeling and recommendation engines
• Data modelling and data warehousing at cloud scale using cloud-native solutions
• Perform development, QA, and DevOps roles as needed to ensure total end-to-end responsibility for solutions

COMPETENCIES
• Experience building, maintaining, and improving data models, processing pipelines, and routing in large-scale environments
• Fluency in common query languages, API development, data transformation, and integration of data streams
• Strong experience with large-dataset platforms (e.g. Amazon EMR, Amazon Redshift, AWS Lambda & Fargate, Amazon Athena, Azure SQL Database, Azure Database for PostgreSQL, Azure Cosmos DB, Databricks)
• Fluency in multiple programming languages, such as Python, shell scripting, SQL, Java, or similar languages and tools appropriate for large-scale data processing
• Experience acquiring data from varied sources such as APIs, data queues, flat files, and remote databases
• Understanding of traditional data warehouse components (e.g. ETL, business intelligence tools)
• Creativity to go beyond current tools to deliver the best solution to the problem
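
As a flavour of the ingestion work described above, here is a minimal Python sketch that pulls records from a REST API and lands them in S3 as newline-delimited JSON; the endpoint, bucket, and key are hypothetical placeholders for the project's real configuration.

    import json
    import boto3
    import requests

    API_URL = "https://api.example.com/v1/orders"   # hypothetical source
    BUCKET = "analytics-landing-zone"               # hypothetical landing bucket

    def ingest_batch():
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        records = resp.json()
        body = "\n".join(json.dumps(r) for r in records)  # newline-delimited JSON
        boto3.client("s3").put_object(
            Bucket=BUCKET,
            Key="raw/orders/orders.jsonl",
            Body=body.encode("utf-8"),
        )

    if __name__ == "__main__":
        ingest_batch()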
Gurugram, Pune, Mumbai, Bengaluru (Bangalore), Chennai, Nashik
4 - 12 yrs
₹12L - ₹15L / yr
Data engineering
Data modeling
data pipeline
Data integration
Data Warehouse (DWH)
+12 more

 

 

Designation – Deputy Manager - TS


Job Description

  1. Total of 8–9 years of development experience in Data Engineering (B1/BII role).
  2. Minimum of 4–5 years in AWS data integrations, with very good data modelling skills.
  3. Should be very proficient in end-to-end AWS data solution design, covering strong data ingestion and integration skills (both data at rest and data in motion) as well as complete DevOps knowledge.
  4. Should have experience delivering at least 4 data warehouse or data lake solutions on AWS.
  5. Should have very strong experience with Glue, Lambda, Data Pipeline, Step Functions, RDS, CloudFormation, etc. (a minimal sketch follows this list).
  6. Strong Python skills.
  7. Should be an expert in cloud design principles, performance tuning, and cost modelling. AWS certifications will be an added advantage.
  8. Should be a team player with excellent communication, able to manage their work independently with minimal or no supervision.
  9. A Life Science & Healthcare domain background will be a plus.
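
As a small illustration of the Glue/Lambda combination named in point 5, the sketch below shows a Lambda handler kicking off a Glue job via boto3; the job name and argument are hypothetical, and a real deployment would parameterise them (for example via environment variables set by CloudFormation).

    import boto3

    glue = boto3.client("glue")

    def lambda_handler(event, context):
        # Hypothetical job name and argument for illustration only.
        run = glue.start_job_run(
            JobName="curate-patient-data",
            Arguments={"--source_path": "s3://raw-zone/hl7/"},
        )
        return {"JobRunId": run["JobRunId"]}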

Qualifications

BE/BTech/ME/MTech

 

Venture Highway

at Venture Highway

3 recruiters
Nipun Gupta
Posted by Nipun Gupta
Bengaluru (Bangalore)
2 - 6 yrs
₹10L - ₹30L / yr
skill iconPython
Data engineering
Data Engineer
MySQL
skill iconMongoDB
+5 more
- Experience with Python and data scraping (a minimal sketch follows this list).
- Experience with relational SQL and NoSQL databases, including MySQL and MongoDB.
- Familiarity with the basic principles of distributed computing and data modeling.
- Experience with distributed data pipeline frameworks like Celery, Apache Airflow, etc.
- Experience with NLP and NER models is a bonus.
- Experience building reusable code and libraries for future use.
- Experience building REST APIs.
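
A minimal data-scraping sketch in Python, assuming a hypothetical page and CSS selectors; a production scraper would add pagination, retries, and politeness controls (rate limiting, robots.txt).

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical target page and selectors, for illustration only.
    resp = requests.get("https://example.com/startups", timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")

    companies = [
        {"name": card.select_one("h2").get_text(strip=True),
         "url": card.select_one("a")["href"]}
        for card in soup.select("div.company-card")
    ]
    print(companies)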

Preference will be given to candidates working in tech product companies.
Servian

at Servian

2 recruiters
sakshi nigam
Posted by sakshi nigam
Bengaluru (Bangalore)
2 - 8 yrs
₹10L - ₹25L / yr
Data engineering
ETL
Data Warehouse (DWH)
Powershell
DA
+7 more
Who we are
 
We are a consultant-led organisation. We invest heavily in our consultants to ensure they have the technical skills and commercial acumen to be successful in their work.

Our consultants have a passion for data and solving complex problems. They are curious, ambitious, and experts in their fields. We have developed a first-rate team, so you will be supported by and learn from the best.

About the role

  • Collaborating with a team of like-minded and experienced engineers for Tier 1 customers, you will focus on data engineering for large, complex data projects. Your work will have an impact on platforms that handle crores of customers and millions of transactions daily.

  • As an engineer, you will use the latest cloud services to design and develop reusable core components and frameworks to modernise data integrations in a cloud-first world, and you will own those integrations end to end, working closely with business units. You will design and build for efficiency, reliability, security, and scalability. As a consultant, you will help drive a data engineering culture and advocate best practices.

Mandatory experience

    • 1-6 years of relevant experience
    • Strong SQL skills and data literacy
    • Hands-on experience designing and developing data integrations, either in ETL tools, cloud native tools or in custom software
    • Proficiency in scripting and automation (e.g. PowerShell, Bash, Python); a minimal Python sketch follows this list
    • Experience in an enterprise data environment
    • Strong communication skills
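
A minimal Python automation sketch of the kind this role involves: loading a CSV extract, applying a simple transformation, and persisting it with SQLite; the file, table, and column names are illustrative assumptions.

    import csv
    import sqlite3

    # Hypothetical extract and target table for illustration.
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, name TEXT)")

    with open("customers.csv", newline="") as f:
        # Normalise names while loading: strip whitespace, title-case.
        rows = [(r["id"], r["name"].strip().title()) for r in csv.DictReader(f)]

    con.executemany("INSERT INTO customers VALUES (?, ?)", rows)
    con.commit()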

Desirable experience

    • Ability to work on data architecture, data models, data migration, integration and pipelines
    • Ability to work on data platform modernisation from on-premise to cloud-native
    • Proficiency in data security best practices
    • Stakeholder management experience
    • Positive attitude with the flexibility and ability to adapt to an ever-changing technology landscape
    • Desire to gain breadth and depth of technologies to support customer's vision and project objectives

What to expect if you join Servian?

    • Learning & Development: We invest heavily in our consultants, offer weekly internal training (both technical and non-technical alike!), and abide by a 'You Pass, We Pay' policy.
    • Career progression: We take a longer-term view of every hire. We have a flat org structure and promote from within. Every hire is developed as a future leader and client adviser.
    • Variety of projects: As a consultant, you will have the opportunity to work across multiple projects across our client base, significantly increasing your skills and exposure in the industry.
    • Great culture: Working on the latest Apple MacBook Pro in our custom-designed offices in the heart of leafy Jayanagar, we provide a peaceful and productive work environment close to shops, parks, and the metro station.
    • Professional development: We invest heavily in professional development, both technically, through training and guided certification pathways, and in consulting, through workshops on client engagement and communication. Growth in our organisation happens from the growth of our people.
Kaleidofin

at Kaleidofin

3 recruiters
Poornima B
Posted by Poornima B
Chennai, Bengaluru (Bangalore)
2 - 4 yrs
Best in industry
PowerBI
Business Intelligence (BI)
skill iconPython
Tableau
SQL
+1 more
We are looking for a developer to design and deliver strategic, data-centric insights leveraging next-generation analytics and BI technologies. We want someone who is data-centric and insight-centric rather than report-centric. We are looking for someone wishing to make an impact by enabling innovation and growth; someone with passion for what they do and a vision for the future.

Responsibilities:
  • Be the analytical expert in Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
  • Develop comprehensive skills including project management, business judgment, analytical problem solving, and technical depth.
  • Become an expert on data and trends, both internal and external to Kaleidofin.
  • Communicate key state-of-the-business metrics and develop dashboards to enable teams to understand business metrics independently.
  • Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights, and drive the planning process with company executives.
  • Automate scheduling and distribution of reports, and support auditing and value realization.
  • Partner with enterprise architects to define and ensure that proposed Business Intelligence solutions adhere to an enterprise reference architecture.
  • Design robust data-centric solutions and architecture that incorporate technology and strong BI solutions to scale up and eliminate repetitive tasks.
Requirements:
  • Experience leading development efforts through all phases of the SDLC.
  • 2+ years of hands-on experience designing analytics and Business Intelligence solutions.
  • Experience with Quicksight, Power BI, Tableau, and Qlik is a plus.
  • Hands-on experience in SQL, data management, and scripting, preferably Python (a minimal sketch follows this list).
  • Strong data visualisation design, data modeling, and inference skills.
  • Hands-on experience managing small teams.
  • Financial services experience preferred, but not mandatory.
  • Strong knowledge of architectural principles, tools, frameworks, and best practices.
  • Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
  • Candidates with less than 30 days' notice period preferred.
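
As one example of the SQL/Python scripting mentioned above, the sketch below computes a monthly metric with pandas and exports it in a shape a BI tool (Power BI, Tableau, Quicksight) could ingest; the input file and column names are hypothetical.

    import pandas as pd

    # Hypothetical dataset: one row per loan disbursement.
    df = pd.read_csv("loans.csv", parse_dates=["disbursed_on"])

    # Roll up to a monthly disbursement total for a dashboard tile.
    monthly = (
        df.assign(month=df["disbursed_on"].dt.to_period("M").astype(str))
          .groupby("month", as_index=False)["amount"].sum()
    )
    monthly.to_csv("monthly_disbursements.csv", index=False)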
Marktine

at Marktine

1 recruiter
Vishal Sharma
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
2 - 10 yrs
₹5L - ₹15L / yr
skill iconMachine Learning (ML)
RNN
DNN
Data modeling
Data Visualization
+2 more
What you will do:
- Understand the business drivers and analytical use cases.
- Translate use cases into data models and descriptive, analytical, predictive, and engineering outcomes.
- Explore new technologies and learn new techniques to solve business problems creatively.
- Think big and drive the strategy for better data quality for the customers.
- Become the voice of business within engineering, and of engineering within the business, with customers.
- Collaborate with many teams, engineering and business, to build better data products and services.
- Deliver projects collaboratively with the team and keep customers updated on time.

What we're looking for:
- Hands-on experience in data modeling, data visualization, and pipeline design and development.
- Hands-on exposure to machine learning concepts like supervised learning, unsupervised learning, RNNs, and DNNs (a toy supervised-learning sketch follows this list).
- Prior experience working with business stakeholders in an enterprise space is a plus.
- Great communication skills. You should be able to directly communicate with senior business leaders, embed yourself with business teams, and present solutions to business stakeholders.
- Experience working independently and driving projects end to end; strong analytical skills.
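
A toy supervised-learning sketch with scikit-learn, using a bundled dataset purely for illustration; a real engagement would start from the customer's business data and success metrics.

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Hold out a test split, fit a classifier, and report accuracy.
    X, y = load_iris(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=42)

    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, model.predict(X_te)))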
Telyport technologies pvt ltd
Bengaluru (Bangalore)
2 - 7 yrs
Best in industry
skill iconPython
skill iconAngular (2+)
skill iconNodeJS (Node.js)
skill iconMongoDB
skill iconReact.js
+1 more
We are seeking rockstar full-stack developers who can expand our product portfolio and make life easier for locals and local businesses. As part of this role, you will build experiences that help users engage with, and feel comfortable using, the products and services we offer.

Why Telyport?
- Be part of an early-stage emerging technology startup.
- Opportunity to collaborate and learn cross-functional areas.
- A broad-thinking team that encourages and supports bringing your ideas to life.

If you are looking for a quick career boost, are following the trend of learning languages without a clear understanding of the problems they solve, or want an ecosystem of routine and mundane tasks, this is not for you.

If you want to be part of our team, you should be willing to understand, adapt, take responsibility, and achieve.

Here are a few must-haves before your profile can be shortlisted.

Must have:
- Strong HTML and CSS skills, and knowledge of at least one JavaScript framework: React, or preferably Angular.
- Strong Python skills.
- A strong NoSQL foundation.
- Caching fundamentals (a minimal sketch follows this list).
- Strong logical and reasoning skills.
- Data modeling skills.
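
A minimal caching sketch in Python using functools.lru_cache; the quote function and its pricing formula are invented for illustration, and a production service would more likely use a shared store such as Redis with an explicit TTL.

    import time
    from functools import lru_cache

    @lru_cache(maxsize=1024)
    def get_delivery_quote(pickup_pin: str, drop_pin: str) -> float:
        time.sleep(0.5)  # stands in for a slow pricing lookup
        return 40.0 + (abs(int(pickup_pin) - int(drop_pin)) % 100)

    print(get_delivery_quote("560037", "560001"))  # computed (slow)
    print(get_delivery_quote("560037", "560001"))  # served from the cache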

If we find your profile interesting, expect a call very soon.

Also, the note field provided has a big impact on our decision.
Bengaluru (Bangalore)
3 - 7 yrs
₹7L - ₹15L / yr
Business Intelligence (BI)
Data modeling
PowerBI
Microsoft SSRS
Microsoft Business Intelligence (MSBI)
+1 more
Hi All,

Greetings from CareerNet Technologies! It's a pleasure talking to you. Please find the details below:

Role: Power BI Developer
Company: KOCH (https://www.kochind.com)
Type: Permanent (direct payroll)
Education: Any full-time graduate
Experience: 4+ years
Job Location: Kundalahalli, near Brookefield Hospital, Bangalore - 560037

As discussed, please find attached the JD, the company details, and its principles.

Job Description:

• 3+ years' experience developing and implementing enterprise-scale reports and dashboards.
• Proficiency with MS Power BI / SSRS.
• Knowledge of logical and physical data modeling concepts (relational and dimensional).
• Understanding of structured query language (SQL); a minimal reporting-query sketch follows this list.
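
The kind of SQL a report or dashboard tile is typically built on, shown here as a self-contained Python/SQLite sketch with invented sales data.

    import sqlite3

    # Hypothetical sales mart, for illustration only.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
      ('South', '2023-01', 120.0),
      ('South', '2023-02', 150.0),
      ('North', '2023-01', 90.0);
    """)

    # A typical report aggregate: revenue by region, highest first.
    for row in con.execute(
            "SELECT region, SUM(revenue) AS total "
            "FROM sales GROUP BY region ORDER BY total DESC"):
        print(row)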

Increasingly Technologies
Bengaluru (Bangalore)
7 - 8 yrs
₹12L - ₹14L / yr
Data modeling
Stored Procedures
SQL server
Reporting
SQL Server Integration Services (SSIS)
+1 more
About Increasingly:

Increasingly is an award-winning, fast-growing retail technology company focused specifically on the automation of cross-selling for online retailers. Our clients range from large global corporations like Samsung and Canon to small and medium-size retailers across the globe. Our AI-based algorithms help a customer buying a TV on Samsung find the matching soundbar and purchase both together. Increasingly is headquartered in London with offices in Lisbon and Bangalore. We work with clients in over 30 countries and 20 languages.

We are looking to rapidly expand our technology and product development operations in India, and we need smart, ambitious people like you who enjoy a fun yet challenging work environment. We believe strongly that diversity and inclusion are the foundations for a lasting, incredible culture. We also believe it's important to get the balance right between work and life.

The job and its impact:

- Be solely responsible for designing databases and ensuring their stability, reliability, and performance. Ensure that the database system is efficient and error-free.
- Optimize queries when a website is too slow due to database performance, write queries for heavy-traffic databases, and design tables and indexes for the best database architecture and performance.
- Lead a team of developers, successfully solving issues and making improvements to databases while ensuring all work meets the necessary requirements.
- Learn new technology. There is plenty out there that falls outside the classical SQL model, and learning how to use it will improve your craft.
- Find a problem to solve and don't be afraid to fail. Learn by making mistakes.
- Be fluent in current technology and able to learn how to correctly utilize new technology, often on the fly. As a result, you must be the most versatile member of any team.
- Everything changes so fast that the fundamentals you learned five years ago may no longer be applicable. Developers must be architects these days: understand the big picture of mapping and try to anticipate the next set of standards.

What essential skills you need:

- Schema design, data modeling, and data upgrades; develop distributed and highly scalable applications with a focus on the scalability, performance, availability, reliability, security, and maintainability of the applications or technologies.
- Code review, SQL query tuning, SQL patches, ownership, and attention to detail; ad-hoc queries and automation in support of internal and external customers' data exchange and integration needs (a minimal tuning sketch appears at the end of this listing).
- Research, troubleshoot, and resolve data issues impacting extract delivery; application design enhancements, development, debugging, and optimization with respect to data concerns and new product features.
- Cloud-managed database engines such as Amazon RDS; NoSQL solutions such as Redis, Cassandra, and MongoDB; make regular recommendations for system improvements.
- Extensive experience with backups, monitoring, performance tuning, and troubleshooting tools and techniques. Ability to identify bad SQL statements and optimize them.
- Support the QA and development teams with project queries, test plans, and test cases to improve the database, with an ability to communicate effectively within a team.
- Self-starter who collects data, analyzes the collected information, designs algorithms, draws flowcharts, and then implements code for the logic developed through those algorithms and flowcharts.

What are the benefits:

- You'll get to work at one of the hottest and fastest-growing retail technology companies in Europe right now.
- You'll be paid a competitive salary and work directly with a super-experienced team of people.
- You'll get a great place to come to work every day: varied, complex, challenging, and with a great culture that you can shape and change.
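
A small, self-contained illustration of the query-tuning skill named above, using Python/SQLite: EXPLAIN QUERY PLAN shows a full table scan before an index exists and an index search afterwards. The table and query are invented for the example.

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
    )

    query = "SELECT total FROM orders WHERE customer_id = ?"

    # Without an index, the planner falls back to a full scan of orders.
    print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())

    # Adding a covering index turns the lookup into an index search.
    con.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
    print(con.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall())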