Delivery Solutions
Senior Data Engineer - SQL BI (4-13 yrs) (SQL, SSIS, SSAS experience)

Posted by TA Team
5 - 13 yrs
₹15L - ₹27L / yr
Chennai
Skills
Python
SQL
SQL Server Integration Services (SSIS)
SQL Server Analysis Services (SSAS)
Power BI
Windows Azure
Data Warehouse (DWH)


About UPS:

Moving our world forward by delivering what matters! UPS is a company with a proud past and an even brighter future. Our values define us. Our culture differentiates us. Our strategy drives us. At UPS we are customer-first, people-led and innovation-driven. UPS's India-based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and help us engineer technology solutions that drastically improve our competitive advantage in the field of logistics.


'Future You' grows as a visible and valued technology professional with UPS, driving us towards an exciting tomorrow. As a global technology organization, we can put serious resources behind your development. If you are solutions-oriented, UPS Technology is the place for you. 'Future You' delivers ground-breaking solutions to some of the biggest logistics challenges around the globe. You'll take technology to unimaginable places and really make a difference for UPS and our customers.


Job Summary:

The Senior Data Engineer - SQL BI supervises and participates in the development of batch and real-time data pipelines using various data analytics processing frameworks in support of Business Intelligence (BI), Data Science, and web application products. The role supports the integration of data from internal and external sources, performs extract, transform, load (ETL) conversions, and facilitates data cleansing and enrichment. It carries out full systems life-cycle management activities, such as analysis, technical requirements, design, coding, testing, and implementation of systems and application software. The role also contributes to synthesizing disparate data sources into reusable and reproducible data assets, and to application development through semantic and analytical modeling.


REQUIREMENTS

  • 4+ years of relevant professional experience
  • In-depth experience with on-premises SQL Server (SQL, SSIS, SSAS)
  • Some experience in Azure (Databricks, Data Factory, Apache Spark, Python)
  • Familiarity with Delta Lake and Unity Catalog concepts in Databricks (see the sketch after this list)
  • Demonstrated awareness of data warehouse concepts (star schema) and methodologies
  • Experience with different feed formats (XML, JSON, etc.)
  • Familiarity with data visualization tools (Power BI preferred)
  • Experience working within Agile frameworks
  • .NET experience is preferred
  • Proficiency in Python, Java, or C# is preferred.
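
The listing names Databricks, Delta Lake, Unity Catalog, and JSON/XML feeds; as a rough, non-authoritative sketch of how those pieces can fit together, the PySpark snippet below ingests a JSON feed and writes it to a Delta table. The feed path, column names, and the three-level `main.sales.fact_orders` table name (catalog.schema.table, which assumes a Unity Catalog-enabled workspace) are illustrative assumptions, not details from the listing.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is already available as `spark`;
# getOrCreate() simply reuses it when run in a notebook.
spark = SparkSession.builder.getOrCreate()

# Hypothetical landing path for an inbound JSON feed.
raw = spark.read.json("/mnt/landing/orders/2024-06-01/")

# Light cleansing/enrichment before loading into the warehouse layer.
orders = (
    raw
    .withColumn("order_date", F.to_date("order_timestamp"))
    .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
)

# Write to a Delta table; saveAsTable registers it in the metastore.
orders.write.format("delta").mode("append").saveAsTable("main.sales.fact_orders")
```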

Additional information: This role will be in-office 3 days a week in Chennai, India.

About Delivery Solutions

Founded: 2015
Stage: Profitable

About

Delivery Solutions provides out-of-the-box solutions that allow retailers to offer customer experiences such as curbside delivery, same-day delivery, shipping, in-store pickup, and post-purchase pickup. The company works with some of the most recognizable names in retail, including Michaels, Sephora, Loblaw, GameStop, Office Depot, Sally Beauty, Total Wine, Belk, and Abercrombie & Fitch.


Its SaaS-based solution is highly configurable and integrates with e-commerce sites, warehouse management systems, order management systems, and point-of-sale systems to deliver a highly scalable experience and a base of delighted customers. It has direct integrations with the most prominent same-day delivery providers, such as DoorDash, Uber, Postmates, and Shipt, as well as major shipping carriers including UPS, FedEx, and USPS.


Perks & Benefits @Delivery Solutions: 

  • Permanent remote work (work from anywhere)
  • Broadband reimbursement
  • Flexible work hours (login/logout flexibility)
  • 21 paid leaves per year (Jan to Dec) and 7 COVID leaves
  • Two appraisal cycles in a year
  • Encashment of unused leaves on gross salary
  • Rewards & recognition (RNR) - Amazon gift vouchers
  • Employee referral bonus
  • Technical and soft-skills training
  • Sodexo meal card
  • Gifts for birthdays, service anniversaries, new babies, and weddings
  • Annual trip

Connect with the team: Ayyappan Paramasivam, Pranali Salvi, Digvijay Singh, Rini Chakravarty

Similar jobs

Marktine
Posted by Vishal Sharma
Remote, Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹20L / yr
Big Data
Spark
PySpark
Data engineering
Data Warehouse (DWH)
+5 more

Azure – Data Engineer

  • At least 2 years of hands-on experience on an Agile data engineering team building big data pipelines using Azure in a commercial environment
  • Experience dealing with senior stakeholders/leadership
  • Understanding of Azure data security and encryption best practices (ADFS/ACLs)

Databricks – experience writing in and using Databricks, using Python to transform and manipulate data.
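
As a hedged sketch of the kind of Python transformation work described above, the snippet below derives a date column, joins a lookup table, and aggregates in PySpark on Databricks; the `staging.sales` and `staging.stores` table names and every column name are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided as `spark` in a Databricks notebook

# Hypothetical source tables; the names are placeholders.
sales = spark.table("staging.sales")
stores = spark.table("staging.stores")

# Typical transformation step: derive a date, join a dimension, aggregate.
daily_sales = (
    sales
    .withColumn("sale_date", F.to_date("sale_timestamp"))
    .join(stores, on="store_id", how="left")
    .groupBy("sale_date", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

daily_sales.write.mode("overwrite").saveAsTable("curated.daily_sales")
```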

Data Factory – experience using Data Factory in an enterprise solution to build data pipelines; experience calling REST APIs.

Synapse/data warehouse – experience using Synapse or a data warehouse to present data securely and to build and manage data models.

Microsoft SQL Server – we expect the candidate to have come from a SQL/data background and progressed into Azure.

Power BI – experience with this is preferred.

Additionally

  • Experience using Git as a source control system
  • Understanding of DevOps concepts and application
  • Understanding of Azure Cloud costs/management and running platforms efficiently
Client of People First Consultants
Agency job
via People First Consultants by Aishwarya KA
Remote, Chennai
3 - 6 yrs
Best in industry
Machine Learning (ML)
Data Science
Deep Learning
Artificial Intelligence (AI)
Python
+1 more

Skills: Machine Learning, Deep Learning, Artificial Intelligence, Python.

Location: Chennai


Domain knowledge:
Data cleaning, modelling, analytics, statistics, machine learning, AI

Requirements:

  • To be part of Digital Manufacturing and Industrie 4.0 projects across the Saint-Gobain group of companies
  • Design and develop AI/ML models to be deployed across SG factories
  • Knowledge of Hadoop, Apache Spark, MapReduce, Scala, Python programming, SQL, and NoSQL databases is required
  • Should be strong in statistics, data analysis, data modelling, machine learning techniques, and neural networks
  • Prior experience in developing AI and ML models is required
  • Experience with data from the manufacturing industry would be a plus

Roles and Responsibilities:

  • Develop AI and ML models for the manufacturing industry, with a focus on energy, asset performance optimization, and logistics
  • Multitasking and good communication skills are necessary
  • Entrepreneurial attitude

 
Product Engineering MNC (FinTech Domain)
Agency job
via Exploro Solutions by Jisha Alex
Bengaluru (Bangalore)
5 - 10 yrs
₹10L - ₹40L / yr
Data Science
Data Scientist
Python
Statistical Modeling
Statistical Analysis
+2 more

Role: Sr Data Scientist / Tech Lead – Data Science

Number of positions: 8

Responsibilities

  • Lead a team of data scientists, machine learning engineers and big data specialists
  • Be the main point of contact for the customers
  • Lead data mining and collection procedures
  • Ensure data quality and integrity
  • Interpret and analyze data problems
  • Conceive, plan and prioritize data projects
  • Build analytic systems and predictive models
  • Test performance of data-driven products
  • Visualize data and create reports
  • Experiment with new models and techniques
  • Align data projects with organizational goals

Requirements (please read carefully)

  • Very strong in statistics fundamentals. Not all data is Big Data; the candidate should be able to derive statistical insights from very few data points if required, using traditional statistical methods (see the sketch after this list).
  • MSc Statistics / PhD Statistics
  • Education – no bar, but preferably from a statistics academic background (e.g., MSc Stats, MSc Econometrics), given the first point
  • Strong expertise in Python (other statistical languages/tools such as R, SAS, and SPSS are optional, but Python is absolutely essential). A candidate who is very strong in Python but has almost no knowledge of the other statistical tools will still be considered a good fit for this role.
  • Proven experience as a Data Scientist or in a similar role, for about 7-8 years
  • Solid understanding of machine learning and AI concepts, especially with respect to choosing appropriate candidate algorithms for a use case and evaluating models
  • Good expertise in writing SQL queries (should not depend on anyone else for pulling data, joining it, data wrangling, etc.)
  • Knowledge of data management and visualization techniques, primarily from a data science perspective
  • Should be able to grasp business problems, ask the right questions to understand the problem in breadth and depth, design apt solutions, and explain them to business stakeholders
  • The last point above is extremely important: the candidate should be able to identify solutions that can be explained to stakeholders and present them in simple, direct language.
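
To illustrate the point about deriving insight from very few data points with traditional statistical methods, here is a toy two-sample t-test in Python; the numbers are invented purely for illustration and are not from the listing.

```python
from scipy import stats

# Invented measurements from two small samples (e.g., processing times
# under an old and a new workflow); far too few points for "big data"
# methods, but enough for a classical two-sample t-test.
old_process = [12.1, 11.8, 13.0, 12.6, 11.9]
new_process = [10.9, 11.2, 10.4, 11.5, 10.8]

t_stat, p_value = stats.ttest_ind(old_process, new_process, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the difference in means is unlikely to be
# due to chance alone, even with only five observations per group.
```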

http://www.altimetrik.com

https://www.youtube.com/watch?v=3nUs4YxppNE&feature=emb_rel_end

https://www.youtube.com/watch?v=e40r6kJdC8c

Celebal Technologies Pvt Ltd
Posted by Anjani Upadhyay
Jaipur
3 - 6 yrs
₹10L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
SQL
NoSQL Databases
+6 more

Job Description: 

An Azure Data Engineer is responsible for designing, implementing, and maintaining pipelines and ETL/ELT flow solutions on the Azure cloud platform. This role requires a strong understanding of database migration technologies and the ability to deploy and manage database solutions in the Azure cloud environment.

 

Key Skills:

  • Min. 3+ years of experience with data modeling, data warehousing, and building ETL pipelines
  • Must have firm knowledge of SQL, NoSQL, SSIS, SSRS, and ETL/ELT concepts
  • Should have hands-on experience with Databricks, ADF (Azure Data Factory), ADLS, and Cosmos DB
  • Excel in the design, creation, and management of very large datasets
  • Detailed knowledge of cloud-based data warehouses: architecture, infrastructure components, ETL, and reporting/analytics tools and environments
  • Skilled at writing, tuning, and troubleshooting SQL queries
  • Experience with big data technologies such as data storage, data mining, data analytics, and data visualization
  • Should be familiar with programming and able to write and debug code in any of the programming languages like Node, Python, C#, .NET, or Java

 

Technical Expertise and Familiarity:

  • Cloud Technologies: Azure (ADF, ADB, Logic Apps, Azure SQL database, Azure Key Vaults, ADLS, Synapse)
  • Database: CosmosDB, Document DB  
  • IDEs: Visual Studio, VS Code, MS SQL Server
  • Data Modelling, ELT, ETL Methodology

Quicken Inc
Posted by Shreelakshmi M
Bengaluru (Bangalore)
5 - 8 yrs
Best in industry
ETL
Informatica
Data Warehouse (DWH)
Python
ETL QA
+1 more
  • Graduate+ in Mathematics, Statistics, Computer Science, Economics, Business, Engineering or equivalent work experience.
  • Total experience of 5+ years with at least 2 years in managing data quality for high scale data platforms.
  • Good knowledge of SQL querying.
  • Strong skill in analysing data and uncovering patterns using SQL or Python.
  • Excellent understanding of data warehouse/big data concepts such as data extraction, data transformation, and data loading (the ETL process).
  • Strong background in automation and building automated testing frameworks for data ingestion and transformation jobs (see the sketch after this list).
  • Experience with big data technologies is a big plus.
  • Experience in machine learning, especially in data quality applications, is a big plus.
  • Experience in building data quality automation frameworks is a big plus.
  • Strong experience working with an Agile development team with rapid iterations. 
  • Very strong verbal and written communication, and presentation skills.
  • Ability to quickly understand business rules.
  • Ability to work well with others in a geographically distributed team.
  • Keen observation skills for analysing data; highly detail-oriented.
  • Excellent judgment, critical-thinking, and decision-making skills; can balance attention to detail with swift execution.
  • Able to identify stakeholders, build relationships, and influence others to get work done.
  • Self-directed and self-motivated individual who takes complete ownership of the product and its outcome.
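
As a minimal sketch of the kind of automated data-quality check such a framework might run on an ingestion job, the snippet below validates completeness, uniqueness, and value ranges with pandas; the table and column names are hypothetical.

```python
import pandas as pd

def check_orders_quality(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures for an orders feed."""
    failures = []

    # Completeness: key columns must not contain nulls.
    for col in ("order_id", "customer_id", "amount"):
        nulls = df[col].isna().sum()
        if nulls:
            failures.append(f"{nulls} null values in column '{col}'")

    # Uniqueness: order_id must be unique.
    dupes = df["order_id"].duplicated().sum()
    if dupes:
        failures.append(f"{dupes} duplicate order_id values")

    # Validity: amounts must be non-negative.
    negatives = (df["amount"] < 0).sum()
    if negatives:
        failures.append(f"{negatives} negative amounts")

    return failures

# Example usage with a tiny in-memory frame.
sample = pd.DataFrame(
    {"order_id": [1, 2, 2], "customer_id": [10, None, 12], "amount": [25.0, 40.0, -5.0]}
)
for failure in check_orders_quality(sample):
    print("FAIL:", failure)
```

In a real framework these checks would typically run as automated tests (e.g., under pytest or as a scheduled job) against each ingestion batch rather than an in-memory sample.
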
They provide both wholesale and retail funding. PM1
Agency job
via Multi Recruit by Sapna Deb
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
ETL
Talend
OLAP
Data governance
SQL
+8 more
  • Key responsibility is to design and develop a data pipeline, including the architecture, prototyping, and development of data extraction, transformation/processing, cleansing/standardizing, and loading into the Data Warehouse at real-time/near-real-time frequency. Source data can be in structured, semi-structured, and/or unstructured formats.
  • Provide technical expertise to design efficient data ingestion solutions to consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc., from enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Development of complex data transformations using Talend (Big Data edition), Python/Java transformations in Talend, SQL/Python/Java UDXs, AWS S3, etc., to load into an OLAP Data Warehouse in structured/semi-structured form
  • Development of data models and transformation logic to populate the models for faster data consumption with simple SQL
  • Implementing automated audit and quality-assurance checks in the data pipeline (see the sketch after this list)
  • Documenting and maintaining data lineage to enable data governance
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
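
As a simplified, assumption-laden sketch of a watermark-based (CDC-style) incremental extract with a basic row-count audit, the snippet below uses an in-memory SQLite database to stand in for the source RDBMS; connection details, the `orders` table, and the `updated_at` watermark column are placeholders, not details from the listing.

```python
import sqlite3

# In-memory SQLite stands in for the source RDBMS; in a real pipeline this
# would be a connection to Oracle/MySQL/etc., and the load target would be
# the OLAP warehouse (Redshift, Snowflake, ...).
src = sqlite3.connect(":memory:")
src.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL, updated_at TEXT);
    INSERT INTO orders VALUES
        (1, 25.0, '2024-06-01T10:00:00'),
        (2, 40.0, '2024-06-02T09:30:00'),
        (3, 15.5, '2024-06-03T12:45:00');
""")

def extract_incremental(conn, last_watermark):
    """Pull only rows changed since the last successful load (CDC-style)."""
    rows = conn.execute(
        "SELECT order_id, amount, updated_at FROM orders WHERE updated_at > ?",
        (last_watermark,),
    ).fetchall()
    new_watermark = max((r[2] for r in rows), default=last_watermark)
    return rows, new_watermark

last_watermark = "2024-06-01T23:59:59"
rows, new_watermark = extract_incremental(src, last_watermark)

# Simple audit check: the extracted row count must match what the source
# reports as changed since the watermark (a stand-in for source-vs-target checks).
source_count = src.execute(
    "SELECT COUNT(*) FROM orders WHERE updated_at > ?", (last_watermark,)
).fetchone()[0]
assert source_count == len(rows), "row-count mismatch between source and extract"
print(f"extracted {len(rows)} rows, new watermark = {new_watermark}")
```

In production the watermark would be persisted between runs and the audit results logged to a data-quality store rather than asserted inline.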

Requirements

  • Programming experience in Python/Java to create functions/UDXs
  • Extensive technical experience with SQL on RDBMS (Oracle/MySQL/PostgreSQL, etc.), including code optimization techniques
  • Strong ETL/ELT skill set using Talend Big Data Edition; experience with Talend CDC and MDM functionality is an advantage
  • Experience and expertise in implementing complex data pipelines, including semi-structured and unstructured data processing
  • Expertise in designing efficient data ingestion solutions to consolidate data from RDBMS, APIs, messaging queues, weblogs, images, audio, documents, etc., from enterprise applications, SaaS applications, and external third-party sites or APIs, through ETL/ELT, API integrations, Change Data Capture, Robotic Process Automation, custom Python/Java coding, etc.
  • Good understanding of and working experience with OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc.) and cloud-native Data Lake solutions (S3, ADLS, BigQuery, etc.)
  • Familiarity with the AWS tool stack for storage and processing; able to recommend the right tools/solutions to address a technical problem
  • Good knowledge of database performance tuning, troubleshooting, and query optimization
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Good knowledge of the design, development, and performance tuning of 3NF/flat/hybrid data models
  • Know-how of any NoSQL DB (DynamoDB, MongoDB, Cosmos DB, etc.) is an advantage
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently

Functional knowledge

  • Data Governance & Quality Assurance
  • Distributed computing
  • Linux
  • Data structures and algorithms
  • Unstructured Data Processing

web app platform or data science consulting
Agency job
via Myna Solutions by Venkat B
Hyderabad
0 - 0 yrs
₹3L - ₹5L / yr
Data Science
Data Scientist
Python
R Programming
Data Science freshers with a product & IT-services-based company

Qualifications
B.Tech/M.Tech
Percentage 70% and above
2018 & 2019 passouts
Should have done at least 3 POC implementations
Passouts from premium institutes are preferable

Digital Aristotle
Posted by Digital Aristotle
Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹15L / yr
Deep Learning
Natural Language Processing (NLP)
Machine Learning (ML)
Python

JD: ML/NLP Tech Lead

- We are looking to hire an ML/NLP Tech Lead who can own products from a technology perspective and manage a team of up to 10 members. You will play a pivotal role in the re-engineering of our products and the transformation and scaling of AssessEd.

WHAT ARE WE BUILDING:

- A revolutionary way of providing continuous assessment of a child's skill and learning, pointing the way to the child's potential in the future. This is as opposed to the traditional one-time, dipstick methodology of a test that hurriedly bundles the child into a slot, which in turn declares the child to be fit for a career in a specific area or a particular set of courses that would perhaps get him somewhere. At the core of our system is a lot of data, both structured and unstructured.

- We have books and questions and web resources and student reports that drive all our machine learning algorithms. Our goal is not only to figure out how a child is coping, but also to figure out how to help him by presenting relevant information and questions in topics that he is struggling to learn.

Required skill sets:

- Wisdom to know when to hustle and when to be calm and dig deep. A strong can-do mentality; someone who is joining us to build on a vision, not just to do a job.

- A deep hunger to learn, understand, and apply your knowledge to create technology.

- Ability and experience tackling hard Natural Language Processing problems and separating the wheat from the chaff, with knowledge of the mathematical tools to describe ideas succinctly and implement them in code.

- Very Good understanding of Natural Language Processing and Machine Learning with projects to back the same.

- Strong fundamentals in Linear Algebra, Probability and Random Variables, and Algorithms.

- Strong systems experience with distributed systems pipelines: Hadoop, Spark, etc.

- Good knowledge of at least one prototyping/scripting language: Python, MATLAB/Octave or R.

- Good understanding of Algorithms and Data Structures.

- Strong programming experience in C++/Java/Lisp/Haskell.

- Good written and verbal communication.

Desired skill sets:

- Passion for a well-engineered product; you are ticked off when something engineered is off, and you want to get your hands dirty and fix it.

- 3+ yrs of research experience in Machine Learning, Deep Learning and NLP

- Top tier peer-reviewed research publication in areas like Algorithms, Computer Vision/Image Processing, Machine Learning or Optimization (CVPR, ICCV, ICML, NIPS, EMNLP, ACL, SODA, FOCS etc)

- Open Source Contribution (include the link to your projects, GitHub etc.)

- Knowledge of functional programming.

- International level participation in ACM ICPC, IOI, TopCoder, etc

 

- International level participation in Physics or Math Olympiad

- Intellectual curiosity about advanced math topics like Theoretical Computer Science, Abstract Algebra, Topology, Differential Geometry, Category Theory, etc.

What can you expect:

- Opportunity to work on interesting and hard research problems, and to see state-of-the-art research applied in practice.

- Opportunity to work on important problems with big social impact: the work you do has a massive, direct impact on the lives of students.

- An intellectually invigorating, phenomenal work environment, with massive ownership and growth opportunities.

- Learn effective engineering habits required to build/deploy large production-ready ML applications.

- Ability to do quick iterations and deployments.

- We would be excited to see you publish papers (though certain restrictions do apply).

Website: http://Digitalaristotle.ai


Work Location: Bangalore

Oil & Energy Industry
Agency job
via Green Bridge Consulting LLP by Susmita Mishra
NCR (Delhi | Gurgaon | Noida)
1 - 3 yrs
₹8L - ₹12L / yr
Machine Learning (ML)
Data Science
Deep Learning
Digital Signal Processing
Statistical signal processing
+6 more
  • Understanding business objectives and developing models that help to achieve them, along with metrics to track their progress
  • Managing available resources such as hardware, data, and personnel so that deadlines are met
  • Analysing the ML algorithms that could be used to solve a given problem and ranking them by their success probability
  • Exploring and visualizing data to gain an understanding of it, then identifying differences in data distribution that could affect performance when deploying the model in the real world
  • Verifying data quality, and/or ensuring it via data cleaning
  • Supervising the data acquisition process if more data is needed
  • Defining validation strategies
  • Defining the pre-processing or feature engineering to be done on a given dataset
  • Defining data augmentation pipelines
  • Training models and tuning their hyperparameters (see the sketch after this list)
  • Analysing the errors of the model and designing strategies to overcome them
  • Deploying models to production
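
As a small illustrative sketch of the model-training and hyperparameter-tuning step (not a prescription for any particular project), the snippet below runs a cross-validated grid search with scikit-learn; the dataset, model choice, and parameter grid are examples only.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Illustrative dataset; a real project would use its own features/labels.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Validation strategy: 5-fold cross-validated grid search over a small
# hyperparameter grid (both the grid and the model are example choices).
param_grid = {"n_estimators": [100, 300], "max_depth": [None, 5, 10]}
search = GridSearchCV(
    RandomForestClassifier(random_state=42), param_grid, cv=5, scoring="f1"
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("held-out accuracy:", search.best_estimator_.score(X_test, y_test))
```
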
Quantiphi Inc.
Posted by Bhavisha Mansukhani
Mumbai
1 - 5 yrs
₹3L - ₹15L / yr
Data Science
Decision Science
Data modeling
Statistical Modeling
Python
+3 more
About us: Quantiphi is a category-defining Data Science and Machine Learning software and services company focused on helping organizations translate the big promise of Big Data and Machine Learning technologies into quantifiable business impact. We were founded on the belief that machine learning and artificial intelligence are transformative technologies that will create the next quantum gain in customer experience and unit economics of businesses. Quantiphi helps clients find and capture hidden value from data through a unique blend of business acumen, big data, machine learning, and intuitive information design.

AthenasOwl (AO) is our "AI for Media" solution that helps content creators and broadcasters create and curate smarter content. We launched the product in 2017 as an AI-powered suite for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, making better targeting decisions, automating hours of post-production work, and monetizing massive content libraries. Please find the attached fact sheet for your reference. For more details visit: www.quantiphi.com ; www.athenasowl.tv

Job Description:
  • Developing high-level solution architecture related to different use cases in the media industry
  • Leveraging both structured and unstructured data from external sources and our proprietary AI/ML models to build solutions and workflows that can be used to give data-driven insights
  • Developing sophisticated yet easy-to-digest interpretations and communicating insights to clients that lead to quantifiable business impact
  • Building deep relationships with clients by understanding their stated, but more importantly, latent needs
  • Working closely with the client-side delivery managers to ensure seamless communication and delivery cadence

Essential Skills and Qualifications:
  • Hands-on experience with statistical tools and techniques in Python
  • Great analytical skills, with expertise in analytical toolkits such as logistic regression, cluster analysis, factor analysis, multivariate regression, statistical modelling, and predictive analysis
  • Advanced knowledge of supervised and unsupervised machine learning algorithms like Random Forest, boosting, SVM, neural networks, collaborative filtering, etc.
  • Ability to think creatively and work well both as part of a team and as an individual contributor
  • Critical eye for the quality of data and a strong desire to get it right
  • Strong communication skills
  • Should be able to read a paper and quickly implement ideas from scratch
  • A pleasantly forceful personality and charismatic communication style