Wallero technologies

Data Engineer - SQL, Data Modelling, ADF & Power BI

Posted by Nikitha Muthuswamy
7 - 15 yrs
₹20L - ₹28L / yr
Hyderabad
Skills
SQL
Data modeling
ADF
PowerBI
  1. Strong communication skills, as the selected candidate will eventually lead a team of two.
  2. SQL – strong in data modelling, table design, and writing SQL queries (illustrated in the sketch below).
  3. ADF – hands-on experience building Azure Data Factory (ADF) pipelines end to end in Azure, including subscription, Integration Runtime (IR), and Resource Group setup.
  4. Power BI – hands-on experience building Power BI reports, including documentation and adherence to existing standards.
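To make the SQL and data-modelling expectations concrete, here is a minimal, hypothetical sketch of a star-schema design and a reporting query, run against an Azure SQL database through pyodbc. Every table, column, and connection detail below is an invented illustration, not part of the actual role.

```python
# Hypothetical star schema (one dimension, one fact) plus a reporting query.
# All names and the connection string are illustrative assumptions.
import pyodbc

DDL = [
    """
    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,   -- e.g. 20240131
        calendar_date DATE NOT NULL,
        month_number  INT NOT NULL,
        year_number   INT NOT NULL
    )
    """,
    """
    CREATE TABLE fact_sales (
        sale_id    BIGINT IDENTITY PRIMARY KEY,
        date_key   INT NOT NULL REFERENCES dim_date (date_key),
        product_id INT NOT NULL,
        quantity   INT NOT NULL,
        net_amount DECIMAL(18, 2) NOT NULL
    )
    """,
]

MONTHLY_REVENUE = """
    SELECT d.year_number, d.month_number, SUM(f.net_amount) AS revenue
    FROM fact_sales AS f
    JOIN dim_date  AS d ON d.date_key = f.date_key
    GROUP BY d.year_number, d.month_number
    ORDER BY d.year_number, d.month_number
"""

def monthly_revenue(conn_str: str) -> list:
    """Create the model (first run only) and return monthly revenue rows."""
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        for stmt in DDL:
            cur.execute(stmt)
        conn.commit()
        return cur.execute(MONTHLY_REVENUE).fetchall()
```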



About Wallero technologies

Founded: 2007
Size: 100-1000
Stage: Profitable
About

Wallero is a global leader in business solutions, IT services, and consulting, with a large network of innovation and delivery centers.

Connect with the team
Nikitha Muthuswamy
Satya Gopaal
Priya Karunakaran
Abilash Perumandla
Keerthana M

Similar jobs

Gipfel & Schnell Consultings Pvt Ltd
Posted by TanmayaKumar Pattanaik
Bengaluru (Bangalore)
3 - 9 yrs
₹9L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+10 more

Qualifications & Experience:

▪ 2-4 years of overall experience in ETL, data pipelines, data warehouse development, and database design
▪ Software solution development using Hadoop technologies such as MapReduce, Hive, Spark, Kafka, and YARN/Mesos
▪ Expert in SQL, with 2+ years of experience in advanced SQL
▪ Good development skills in Java, Python, or other languages
▪ Experience with EMR and S3
▪ Knowledge of and exposure to BI applications, e.g. Tableau, QlikView
▪ Comfortable working in an agile environment
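As a flavour of the pipeline work described above, the sketch below shows a minimal PySpark batch job: read raw CSV events from S3, aggregate them per day, and write the result back as partitioned Parquet. The bucket paths and column names are assumptions made up for illustration, and the job presumes a cluster (e.g. EMR) with S3 access configured.

```python
# Hypothetical daily-aggregation ETL step (raw CSV on S3 -> curated Parquet).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-event-aggregates").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3://example-raw-bucket/events/"))          # assumed input layout

daily = (raw
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "event_type")
         .agg(F.count("*").alias("events"),
              F.countDistinct("user_id").alias("unique_users")))

(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/daily_event_aggregates/"))
```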

Carsome
Posted by Piyush Palkar
Remote, Kuala Lumpur
1 - 6 yrs
₹10L - ₹30L / yr
Data Science
Machine Learning (ML)
Python
SQL
Problem solving
+4 more

Carsome’s Data Department is on the lookout for a Data Scientist/Senior Data Scientist with a strong passion for building data-powered products.

 

The Data Science function within the Data Department is responsible for standardising methods (including code libraries and documentation), mentoring data science team members and interns, assuring the quality of outputs, and applying modelling techniques and statistics across a variety of technologies, open-source languages, and cloud computing platforms.

 

You will get to lead and implement projects such as price optimization/prediction, iconic personalization experiences for our customers, inventory optimization, etc.

 

Job Description

 

  • Identify and integrate datasets that can be leveraged through our product, working closely with the data engineering team to develop data products.
  • Execute analytical experiments methodically to help solve problems and make a true impact across functions such as operations, finance, logistics, and marketing.
  • Identify, prioritize, and design testing opportunities that will inform algorithm enhancements.
  • Devise and utilize algorithms and models to mine big data stores; perform data and error analysis to improve models; clean and validate data for uniformity and accuracy.
  • Unlock insights by analyzing large amounts of complex website traffic and transactional data.
  • Implement analytical models in production by collaborating with data analytics engineers.

 

Technical Requirements

 

  • Expertise in model design, training, evaluation, and implementation. ML algorithm expertise: k-nearest neighbors, random forests, Naive Bayes, regression models, gradient boosting, t-SNE. Deep learning expertise with PyTorch, TensorFlow, or Keras. Tooling: Python, PySpark, SQL, R, AWS SageMaker/Personalize, etc.
  • Machine Learning / Data Science Certification
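Price optimization/prediction is called out above as a flagship project; the snippet below is a deliberately simplified, hypothetical baseline for that kind of problem using a random-forest regressor. The file name and feature columns are invented, not Carsome's actual schema.

```python
# Hypothetical baseline: predict used-car sale price with a random forest.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("listings.csv")                      # assumed columns below
X = pd.get_dummies(df[["make", "model_year", "mileage_km", "engine_cc"]])
y = df["sold_price"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = RandomForestRegressor(n_estimators=300, random_state=42)
model.fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```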

 

Experience & Education 

 

  • Bachelor’s in Engineering / Master’s in Data Science  / Postgraduate Certificate in Data Science. 
Career Forge
Posted by Mohammad Faiz
Delhi, Gurugram, Noida, Ghaziabad, Faridabad
5 - 7 yrs
₹12L - ₹15L / yr
Python
Apache Spark
PySpark
Data engineering
ETL
+10 more

🚀 Exciting Opportunity: Data Engineer Position in Gurugram 🌐


Hello 


We are actively seeking a talented and experienced Data Engineer to join our dynamic team at Reality Motivational Venture in Gurugram (Gurgaon). If you're passionate about data, thrive in a collaborative environment, and possess the skills we're looking for, we want to hear from you!


Position: Data Engineer  

Location: Gurugram (Gurgaon)  

Experience: 5+ years 


Key Skills:

- Python

- Spark, PySpark

- Data Governance

- Cloud (AWS/Azure/GCP)


Main Responsibilities:

- Define and set up analytics environments for "Big Data" applications in collaboration with domain experts.

- Implement ETL processes for telemetry-based and stationary test data.

- Support in defining data governance, including data lifecycle management.

- Develop large-scale data processing engines and real-time search and analytics based on time series data.

- Ensure technical, methodological, and quality standards are met.

- Support CI/CD processes.

- Foster know-how development and transfer, and the continuous improvement of leading technologies within Data Engineering.

- Collaborate with solution architects on the development of complex on-premise, hybrid, and cloud solution architectures.
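Since the responsibilities above centre on ETL for telemetry and time-series data (with Dask or Spark named later as the processing environment), here is a minimal, hypothetical Dask sketch that rolls raw signal readings up to hourly means. Paths, column names, and the parquet layout are assumptions for illustration only.

```python
# Hypothetical telemetry roll-up: raw readings -> hourly mean per vehicle/signal.
import dask.dataframe as dd

telemetry = dd.read_parquet("data/telemetry/*.parquet")   # assumed columns:
                                                          # vehicle_id, signal,
                                                          # timestamp, value
hourly = (telemetry
          .assign(hour=telemetry["timestamp"].dt.floor("1h"))
          .groupby(["vehicle_id", "signal", "hour"])["value"]
          .mean()
          .compute()                 # returns a pandas Series
          .rename("mean_value")
          .reset_index())

hourly.to_parquet("data/curated/hourly_signal_means.parquet", index=False)
```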


Qualification Requirements:

- BSc, MSc, MEng, or PhD in Computer Science, Informatics/Telematics, Mathematics/Statistics, or a comparable engineering degree.

- Proficiency in Python and the PyData stack (Pandas/Numpy).

- Experience in high-level programming languages (C#/C++/Java).

- Familiarity with scalable processing environments like Dask (or Spark).

- Proficient in Linux and scripting languages (Bash Scripts).

- Experience in containerization and orchestration of containerized services (Kubernetes).

- Education in database technologies (SQL/OLAP and NoSQL).

- Interest in Big Data storage technologies (Elastic, ClickHouse).

- Familiarity with Cloud technologies (Azure, AWS, GCP).

- Fluent English communication skills (speaking and writing).

- Ability to work constructively with a global team.

- Willingness to travel for business trips during development projects.


Preferable:

- Working knowledge of vehicle architectures, communication, and components.

- Experience in additional programming languages (C#/C++/Java, R, Scala, MATLAB).

- Experience in time-series processing.


How to Apply:

Interested candidates, please share your updated CV/resume with me.


Thank you for considering this exciting opportunity.

Blend360
Posted by VasimAkram Shaik
Hyderabad
5 - 13 yrs
Best in industry
Tableau
SQL
Business Intelligence (BI)
Spotfire
Qlikview
+3 more

Key Responsibilities:

• Design, develop, support, and maintain automated business intelligence products in Tableau.
• Rapidly design, develop, and implement reporting applications that embed KPI metrics and actionable insights into the operational, tactical, and strategic activities of key business functions.
• Communicate effectively with users and other technical teams.
• Identify business requirements, design processes that leverage/adapt the business logic, and regularly communicate with business stakeholders to ensure delivery meets business needs.
• Design, code, and review business intelligence projects developed in Tableau and Power BI.
• Work as a team member and lead teams to implement BI solutions for our customers.
• Develop dashboards and data sources that meet and exceed customer requirements.
• Partner with business information architects to understand the business use cases that support and fulfil business and data strategy.
• Partner with Product Owners and cross-functional teams in a collaborative and agile environment.
• Provide best practices for data visualization and Tableau implementations.
• Work with solution architects on RFI/RFP response solution design, customer presentations, demonstrations, POCs, etc.
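Much of the work above comes down to shaping data into the KPI grain a Tableau or Power BI dashboard expects. The hypothetical sketch below shows that last mile in pandas: aggregate orders and visits to daily KPIs and write a flat extract a dashboard could connect to. All file and column names are invented for illustration.

```python
# Hypothetical KPI extract: daily revenue, orders, sessions, and conversion rate.
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_ts"])   # assumed schema
visits = pd.read_csv("visits.csv", parse_dates=["visit_ts"])

daily_orders = (orders
                .assign(day=orders["order_ts"].dt.date)
                .groupby("day")
                .agg(revenue=("amount", "sum"), orders=("order_id", "nunique")))

daily_visits = (visits
                .assign(day=visits["visit_ts"].dt.date)
                .groupby("day")
                .agg(sessions=("session_id", "nunique")))

kpis = daily_orders.join(daily_visits, how="outer").fillna(0)
kpis["conversion_rate"] = kpis["orders"] / kpis["sessions"].where(kpis["sessions"] > 0)

# A Tableau / Power BI dashboard would point at this extract (or its DB equivalent).
kpis.to_csv("kpi_daily.csv")
```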



Desired Candidate Profile:

• 6-10 years of programming experience with demonstrated proficiency in Tableau; Tableau certifications are highly preferred.
• Ability to architect and scope complex projects.
• Strong understanding of SQL and basic understanding of programming languages; experience with SAQL, SOQL, Python, or R a plus.
• Applied experience in Agile development processes (Scrum).
• Ability to independently learn new technologies.
• Ability to show initiative and work independently with minimal direction.
• Presentation skills – demonstrated ability to simplify complex situations and ideas and distill them into compelling and effective written and oral presentations.
• Learn quickly – ability to understand and rapidly comprehend new areas, functional and technical, and apply detailed and critical thinking to customer solutions.

Education:

• Bachelor's or Master's degree in Computer Science, Computer Engineering, or quantitative studies such as Statistics, Math, Operations Research, Economics, or Advanced Analytics.

Fintech Pioneer | GGN
Agency job
via Unnati by Astha Bharadwaj
NCR (Delhi | Gurgaon | Noida)
8 - 13 yrs
₹60L - ₹70L / yr
Data Science
Data Scientist
Python
SQL
Machine Learning (ML)
+4 more
Join a leading MCommerce company, set your career on a flight towards success and growth.
 
Our client is one of the oldest fintech companies that is taking banking and financial services to all the customers through their online platform. Having served over 50 million customers in the last 15 years, it is enabling over 7mn banking transactions each month, with a network of nearly 2 lac merchants. Using its vast network of merchant outlets, the platform is reaching the lower and mid-income groups who deal in cash, for them to be able to remit money across the country digitally. It now plans to take its unique digital financial solutions to developing markets across the globe. As pioneers of mobile-based payment services in India, they empower Retailers, Individuals and Businesses to have an online presence and earn or save a little extra through the transactions.
 
As a Head - Data Science, you will be part of the leadership team and will be expected to manage ambiguity & help the Founders & other leaders in building the roadmap forward for the business.
 
You will be expected to adopt an "iron sharpens iron" attitude, focusing on making everyone and every data-driven process better, and blending people leadership and management skills with predictive modelling and analytics expertise, cloud computing skills, and operational know-how.
 
What you will do:
  • Working closely with business stakeholders to define, strategize, and execute crucial business problem statements which lie at the core of improving current and future data-backed product offerings.
  • Building and refining underwriting models for extending credit to sellers and API partners, in collaboration with the lending team
  • Conceiving, planning, and prioritizing data projects and managing timelines
  • Building analytical systems and predictive models as part of the agile ecosystem
  • Testing the performance of data-driven products, participating in sprint-wise feature releases
  • Managing a team of data scientists and data engineers to develop, train, and test predictive models
  • Managing collaboration with internal and external stakeholders
  • Building a data-centric culture from within, partnering with every team, learning deeply about the business, and working with highly experienced, sharp, and insanely ambitious colleagues
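One concrete workstream above is underwriting models for extending credit to sellers. The sketch below is a deliberately toy, hypothetical version of such a model (logistic regression on invented merchant features); a production underwriting model would involve far richer features, validation, monitoring, and regulatory review.

```python
# Hypothetical toy underwriting model: probability that a merchant defaults.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("merchant_history.csv")          # assumed columns below
features = ["monthly_txn_volume", "avg_ticket_size",
            "months_on_platform", "chargeback_rate"]
X, y = df[features], df["defaulted"]              # 1 = defaulted on prior credit

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_tr, y_tr)

print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```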
 

What you need to have:

  • B.Tech/M.Tech/MS/PhD in Data Science, Computer Science, Statistics, or Mathematics & Computation from IIT, BITS Pilani, or ISI, with a demonstrated skill set in leading an Analytics and Data Science team
  • 8+ years working in the Data Science and analytics domain, with 3+ years of experience leading a data science team: prioritizing projects and aligning team strategy with the organization's mission
  • Deep understanding of the credit risk landscape; should have built or maintained underwriting models for unsecured lending products
  • Should have held a leadership role in a tech startup, preferably a fintech/lending/credit-risk startup.
  • We value entrepreneurial spirit: experience starting your own venture is an added advantage.
  • Strategic thinker with agility and endurance
  • Aware of the latest industry trends in Data Science and Analytics with respect to fintech, digital transformation, and the credit-lending domain
  • Excellent communication is key to managing multiple stakeholders such as the leadership team, product teams, and existing and new investors.
  • Cloud computing, Python, SQL, ML algorithms, analytics, and a problem-solving mindset
  • Knowledge and demonstrated skill set in AWS
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
Teradata
Vertica
Python
DBA
Redshift
+8 more
  • Key responsibility is to design, develop, and maintain efficient data models for the organization, maintained to ensure optimal query performance for the consumption layer.
  • Developing, deploying, and maintaining a repository of UDXs written in Java/Python (a minimal example follows this list).
  • Develop optimal data model designs, analyzing complex distributed data deployments and making recommendations to optimize performance based on data consumption patterns, performance expectations, the queries executed on the tables/databases, etc.
  • Periodic database health checks and maintenance
  • Designing collections in a NoSQL database for efficient performance
  • Document and maintain a data dictionary from various sources to enable data governance
  • Coordination with business teams, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.
  • Data governance process implementation and ensuring data security
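For the UDX item above, the sketch below shows one hypothetical form this can take: registering a scalar Python UDF on an Amazon Redshift-style warehouse through psycopg2. The function name, connection string, and choice of Redshift are assumptions; Snowflake, Vertica, and Teradata each have their own UDF dialects.

```python
# Hypothetical: create and call a scalar Python UDF on Amazon Redshift.
import psycopg2

UDF_DDL = """
CREATE OR REPLACE FUNCTION f_normalize_phone(raw VARCHAR)
RETURNS VARCHAR
STABLE
AS $$
    import re
    return re.sub(r'\\D', '', raw) if raw else None
$$ LANGUAGE plpythonu;
"""

# Connection details are placeholders.
with psycopg2.connect("dbname=analytics host=example-cluster user=etl") as conn:
    with conn.cursor() as cur:
        cur.execute(UDF_DDL)
        cur.execute("SELECT f_normalize_phone('+91 (040) 1234-5678')")
        print(cur.fetchone()[0])      # -> '9104012345678'
```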

Requirements

  • Extensive working experience in Designing & Implementing Data models in OLAP Data Warehousing solutions (Redshift, Synapse, Snowflake, Teradata, Vertica, etc).
  • Programming experience using Python / Java.
  • Working knowledge in developing & deploying User-defined Functions (UDXs) using Java / Python.
  • Strong understanding & extensive working experience in OLAP Data Warehousing (Redshift, Synapse, Snowflake, Teradata, Vertica, etc) architecture and cloud-native Data Lake (S3, ADLS, BigQuery, etc) Architecture.
  • Strong knowledge in Design, Development & Performance tuning of 3NF/Flat/Hybrid Data Model.
  • Extensive technical experience in SQL including code optimization techniques.
  • Strong knowledge of database performance tuning and troubleshooting.
  • Knowledge of collection design in any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.), along with implementation of best practices.
  • Ability to understand business functionality, processes, and flows.
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently.
  • Any OLAP DWH DBA experience and user management will be an added advantage.
  • Knowledge of financial-industry-specific data models such as FSLDM, IBM Financial Data Model, etc. will be an added advantage.
  • Experience in Snowflake will be an added advantage.
  • Working experience in BFSI/NBFC and a data understanding of loan/mortgage data will be an added advantage.

Functional knowledge

  • Data Governance & Quality Assurance
  • Modern OLAP Database Architecture & Design
  • Linux
  • Data structures, algorithm & data modeling techniques
  • NoSQL database architecture
  • Data Security

 

Falcon Autotech
Posted by Rohit Kaushik
Noida
3 - 7 yrs
₹4L - ₹7L / yr
Data Analytics
Data Analyst
Tableau
MySQL
SQL
  • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy
  • Expertise in SQL/PL-SQL – ability to write procedures and create queries for reporting purposes.
  • Must have worked on a reporting tool – Power BI/Tableau etc.
  • Strong knowledge of Excel/Google Sheets – must have worked with pivot tables, aggregate functions, and logical IF conditions (mirrored in the pandas sketch after this list).
  • Strong verbal and written communication skills for coordination with departments.
  • An analytical mind and inclination for problem-solving
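Because the list above leans on pivot tables, aggregate functions, and logical IF conditions, the hypothetical snippet below shows the same three ideas expressed in pandas rather than a spreadsheet. The source file and column names are invented.

```python
# Hypothetical: the spreadsheet trio (IF condition, pivot table, aggregates) in pandas.
import numpy as np
import pandas as pd

sales = pd.read_csv("sales.csv")                      # assumed columns below

# Logical IF: flag large orders.
sales["large_order"] = np.where(sales["amount"] > 10_000, "yes", "no")

# Pivot table with aggregate functions, like a spreadsheet pivot.
report = pd.pivot_table(
    sales,
    index="region",
    columns="large_order",
    values="amount",
    aggfunc=["sum", "count"],
    fill_value=0,
)
print(report)
```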
Mobile Programming LLC
Posted by Apurva Kalsotra
Mohali, Gurugram, Bengaluru (Bangalore), Chennai, Hyderabad, Pune
3 - 8 yrs
₹3L - ₹9L / yr
Data Warehouse (DWH)
Big Data
Spark
Apache Kafka
Data engineering
+14 more
Day-to-day Activities
Develop complex queries, pipelines and software programs to solve analytics and data mining problems
Interact with other data scientists, product managers, and engineers to understand business problems and technical requirements, in order to deliver predictive and smart data solutions
Prototype new applications or data systems
Lead data investigations to troubleshoot data issues that arise along the data pipelines
Collaborate with different product owners to incorporate data science solutions
Maintain and improve data science platform
Must Have
BS/MS/PhD in Computer Science, Electrical Engineering or related disciplines
Strong fundamentals: data structures, algorithms, database
5+ years of software industry experience, with 2+ years in analytics, data mining, and/or data warehousing
Fluency with Python
Experience developing web services using REST approaches.
Proficiency with SQL/Unix/Shell
Experience in DevOps (CI/CD, Docker, Kubernetes)
Self-driven, challenge-loving, detail-oriented, team spirit, excellent communication skills, ability to multi-task and manage expectations
Preferred
Industry experience with big data processing technologies such as Spark and Kafka
Experience with machine learning algorithms and/or R a plus 
Experience in Java/Scala a plus
Experience with any MPP analytics engines like Vertica
Experience with data integration tools like Pentaho/SAP Analytics Cloud
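Spark and Kafka are listed above as preferred big data technologies; the sketch below is a minimal, hypothetical Structured Streaming job that consumes click events from a Kafka topic and counts them per page per minute. The broker address, topic name, and event schema are assumptions, and the Kafka source requires the spark-sql-kafka connector package on the cluster.

```python
# Hypothetical: windowed counts over a Kafka click stream with Structured Streaming.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("click-stream-counts").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("page", StringType()),
    StructField("ts", TimestampType()),
])

clicks = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # assumed broker
          .option("subscribe", "clicks")                      # assumed topic
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

per_page = clicks.groupBy(F.window("ts", "1 minute"), "page").count()

query = (per_page.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```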
The other Fruit
Posted by Dipendra Singh
Pune
1 - 5 yrs
₹3L - ₹15L / yr
Machine Learning (ML)
Artificial Intelligence (AI)
Python
Data Structures
Algorithms
+17 more
 
SD (ML and AI) job description:

Advanced degree in computer science, math, statistics, or a related discipline (a Master's degree is a must)
Extensive data modeling and data architecture skills
Programming experience in Python, R
Background in machine learning frameworks such as TensorFlow or Keras
Knowledge of Hadoop or other distributed computing systems
Experience working in an Agile environment
Advanced math skills (important): linear algebra, discrete math, differential equations (ODEs and numerical methods), theory of statistics, numerical analysis (numerical linear algebra and quadrature), abstract algebra, number theory, real analysis, complex analysis, intermediate analysis (point-set topology)
Strong written and verbal communication
Hands-on experience with NLP and NLG
Experience with advanced statistical techniques and concepts (GLM/regression, random forests, boosting, trees, text mining) and their application
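Given the emphasis above on Keras/TensorFlow together with NLP, here is a tiny, hypothetical text-classification sketch: a TextVectorization layer feeding an embedding and a single sigmoid output. The example sentences, labels, and hyperparameters are invented and far too small to train anything useful; the point is only the shape of the pipeline.

```python
# Hypothetical: minimal Keras text classifier (sentiment on short snippets).
import numpy as np
import tensorflow as tf

texts = np.array(["great product", "terrible support",
                  "works as expected", "would not recommend"])
labels = np.array([1, 0, 1, 0])

vectorize = tf.keras.layers.TextVectorization(max_tokens=1000,
                                               output_sequence_length=16)
vectorize.adapt(texts)

model = tf.keras.Sequential([
    vectorize,                                            # strings -> int sequences
    tf.keras.layers.Embedding(input_dim=1000, output_dim=16),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(texts, labels, epochs=5, verbose=0)

print(model.predict(np.array(["really great"]))[0, 0])    # P(positive); toy data only
```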
 
Bengaluru (Bangalore)
5 - 7 yrs
₹14.5L - ₹16.5L / yr
Data Science
Data Scientist
Data Analytics
Machine Learning (ML)
Python
+2 more
  • Actively engage with internal business teams to understand their challenges and deliver robust, data-driven solutions.
  • Work alongside global counterparts to solve data-intensive problems using standard analytical frameworks and tools.
  • Be encouraged and expected to innovate and be creative in your data analysis, problem-solving, and presentation of solutions.
  • Network and collaborate with a broad range of internal business units to define and deliver joint solutions.
  • Work alongside customers to leverage cutting-edge technology (machine learning, streaming analytics, and ‘real’ big data) to creatively solve problems and disrupt existing business models.

In this role, we are looking for:

  • A problem-solving mindset with the ability to understand business challenges and how to apply your analytics expertise to solve them.
  • The unique person who can present complex mathematical solutions in a simple manner that most will understand, including customers.
  • An individual excited by innovation and new technology, eager to find ways to employ these innovations in practice.
  • A team mentality, empowered by the ability to work with a diverse set of individuals.

Basic Qualifications

  • A Bachelor’s degree in Data Science, Math, Statistics, Computer Science or related field with an emphasis on analytics.
  • 5+ Years professional experience in a data scientist/analyst role or similar.
  • Proficiency in your statistics/analytics/visualization tool of choice, but preferably in the Microsoft Azure Suite, including Azure ML Studio and PowerBI as well as R, Python, SQL.

Preferred Qualifications

  • Excellent communication, organizational transformation, and leadership skills
  • Demonstrated excellence in Data Science, Business Analytics and Engineering

 

 

 

 

 
