SQL Developer
Fragma Data Systems
Posted by Sandhya JD
5 - 7 yrs
₹10L - ₹18L / yr
Remote only
Skills
ETL
Informatica
Data Warehouse (DWH)
SQL
SSIS
SQL Developer with 7 years of relevant experience and strong communication skills.
 
Key responsibilities:
 
 
  • Create, design, and develop data models
  • Prepare plans for all ETL (Extract/Transform/Load) procedures and architectures
  • Validate results and create business reports
  • Monitor and tune data loads and queries (see the sketch after this list)
  • Develop and prepare a schedule for a new data warehouse
  • Analyze large databases and recommend appropriate optimizations
  • Administer requirements and design functional specifications for data
  • Provide support across the Software Development Life Cycle
  • Prepare code designs and ensure efficient implementation
  • Evaluate all code and ensure the quality of all project deliverables
  • Monitor data warehouse work and provide subject matter expertise
  • Apply hands-on BI practices, data structures, data modeling, and SQL skills
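
As a loose illustration of the "monitor and tune data loads and queries" responsibility above, here is a minimal post-load reconciliation sketch. The table and column names are hypothetical, and Python's built-in sqlite3 stands in for the warehouse engine purely so the example is self-contained; in practice the same checks would run against the actual target database.

```python
import sqlite3

# Hypothetical example: reconcile a staging load against the warehouse target.
# Table and column names are illustrative, not taken from the job description.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dwh_orders (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
    INSERT INTO dwh_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def load_metrics(table: str) -> tuple:
    """Return (row_count, total_amount) for a table -- a simple reconciliation signal."""
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

stg_count, stg_sum = load_metrics("stg_orders")
dwh_count, dwh_sum = load_metrics("dwh_orders")

# A real load monitor would raise an alert or fail the job instead of printing.
if (stg_count, stg_sum) == (dwh_count, dwh_sum):
    print(f"Load OK: {dwh_count} rows, amount total {dwh_sum}")
else:
    print(f"Load mismatch: staging={stg_count}/{stg_sum}, target={dwh_count}/{dwh_sum}")

conn.close()
```

A scheduled job running checks like this (row counts, sums of key measures) is a lightweight way to catch partial or duplicated loads before business reports pick them up.
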
 
 

Experience range: 5 - 10 years
Function: Information Technology
Desired Skills
Must-have skills: SQL

Hard Skills for a Data Warehouse Developer:
 
  • Hands-on experience with ETL tools e.g., DataStage, Informatica, Pentaho, Talend
  • Sound knowledge of SQL
  • Experience with SQL databases such as Oracle, DB2, and SQL
  • Experience using Data Warehouse platforms e.g., SAP, Birst
  • Experience designing, developing, and implementing Data Warehouse solutions
  • Project management and system development methodology
  • Ability to proactively research solutions and best practices
 
Soft Skills for Data Warehouse Developers:
 
  • Excellent Analytical skills
  • Excellent verbal and written communications
  • Strong organization skills
  • Ability to work on a team, as well as independently

About Fragma Data Systems

Founded: 2015
Stage: Profitable
About

Fragma is a leading Big Data, AI, and advanced analytics company providing services to global clients.

Connect with the team: Mallikarjun Degul, Sandhya JD, Varun Reddy, Priyanka U, Simpy Kumari, Minakshi Kumari, Latha Yuvaraj, Vamsikrishna G

Similar jobs

Delhi, Gurugram, Noida, Ghaziabad, Faridabad
10 - 15 yrs
₹10L - ₹15L / yr
Data Warehouse (DWH)
Informatica
ETL
Amazon Web Services (AWS)
Migration

Greetings!

We are looking to fill this position urgently.

Experience: Minimum 10 years
Location: Delhi
Salary: Negotiable



Role

AWS Data Migration Consultant

Provide Data Migration strategy, expert review, and guidance on data migration from on-prem to AWS infrastructure that includes AWS Fargate, PostgreSQL, and DynamoDB. This includes review and SME inputs on:

  • Data migration plan, architecture, policies, and procedures
  • Migration testing methodologies
  • Data integrity, consistency, and resiliency
  • Performance and scalability
  • Capacity planning
  • Security, access control, and encryption
  • DB replication and clustering techniques
  • Migration risk mitigation approaches
  • Verification and integrity testing, reporting (record- and field-level verifications); see the sketch after this list
  • Schema consistency and mapping
  • Logging and error recovery
  • Dev-test, staging, and production artifact promotions and deployment pipelines
  • Change management
  • Backup and DR approaches and best practices
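
As a rough sketch of the record- and field-level verification item in the list above, one common approach is to compare per-record fingerprints between source and target. The example below uses only the Python standard library with stubbed rows; the function and field names are illustrative, and actual PostgreSQL/DynamoDB connectivity is deliberately left out since the posting does not specify it.

```python
import hashlib
import json

def record_fingerprint(record: dict) -> str:
    """Stable hash of a record's fields, used for field-level comparison."""
    canonical = json.dumps(record, sort_keys=True, default=str)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def verify_migration(source_rows, target_rows, key_field):
    """Compare source and target record sets and report missing or altered records."""
    src = {r[key_field]: record_fingerprint(r) for r in source_rows}
    tgt = {r[key_field]: record_fingerprint(r) for r in target_rows}
    return {
        "missing_in_target": sorted(set(src) - set(tgt)),
        "unexpected_in_target": sorted(set(tgt) - set(src)),
        "field_mismatches": sorted(k for k in set(src) & set(tgt) if src[k] != tgt[k]),
    }

# Stubbed rows standing in for the on-prem source and the AWS target.
source = [{"id": 1, "name": "Asha", "balance": 120.5}, {"id": 2, "name": "Ravi", "balance": 99.0}]
target = [{"id": 1, "name": "Asha", "balance": 120.5}, {"id": 2, "name": "Ravi", "balance": 90.0}]

print(verify_migration(source, target, key_field="id"))
# {'missing_in_target': [], 'unexpected_in_target': [], 'field_mismatches': [2]}
```

At migration scale the same idea is usually applied in batches (hashing per key range or partition) so that source and target never need to be held in memory at once.
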


Qualifications

  • Worked on mid-to-large-scale data migration projects, specifically from on-prem to AWS, preferably in the BFSI domain
  • Deep expertise in AWS Redshift, PostgreSQL, and DynamoDB from a data management, performance, scalability, and consistency standpoint
  • Strong knowledge of AWS Cloud architecture and components, solutions, and well-architected frameworks
  • Expertise in SQL and DB performance-related aspects
  • Solution architecture work for enterprise-grade BFSI applications
  • Successful track record of defining and implementing data migration strategies
  • Excellent communication and problem-solving skills
  • 10+ years of experience in technology, with at least 4 years in AWS and DBA/DB management/migration-related work
  • Bachelor's degree or higher in Engineering or a related field


Quicken Inc
Posted by Shreelakshmi M
Bengaluru (Bangalore)
5 - 8 yrs
Best in industry
ETL
Informatica
Data Warehouse (DWH)
Python
ETL QA
  • Graduate+ in Mathematics, Statistics, Computer Science, Economics, Business, Engineering or equivalent work experience.
  • Total experience of 5+ years with at least 2 years in managing data quality for high scale data platforms.
  • Good knowledge of SQL querying.
  • Strong skill in analysing data and uncovering patterns using SQL or Python.
  • Excellent understanding of data warehouse/big data concepts such as data extraction, data transformation, and data loading (the ETL process).
  • Strong background in automation and in building automated testing frameworks for data ingestion and transformation jobs (see the sketch after this list).
  • Experience in big data technologies a big plus.
  • Experience in machine learning, especially in data quality applications a big plus.
  • Experience in building data quality automation frameworks a big plus.
  • Strong experience working with an Agile development team with rapid iterations. 
  • Very strong verbal and written communication, and presentation skills.
  • Ability to quickly understand business rules.
  • Ability to work well with others in a geographically distributed team.
  • Keen observation skills to analyse data, highly detail oriented.
  • Excellent judgment, critical-thinking, and decision-making skills; can balance attention to detail with swift execution.
  • Able to identify stakeholders, build relationships, and influence others to get work done.
  • Self-directed and self-motivated individual who takes complete ownership of the product and its outcome.
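
To make the automated data-quality testing mentioned above concrete, here is a minimal sketch of the kind of checks such a framework might run on an ingested batch. It assumes pandas is available; the column names and rules are hypothetical, not part of the job description.

```python
import pandas as pd

# Hypothetical ingested batch; a real framework would pull this from the data platform.
batch = pd.DataFrame({
    "user_id": [1, 2, 2, 4, None],
    "signup_date": ["2024-01-02", "2024-01-03", "2024-01-03", "2024-01-05", "2024-01-06"],
    "amount": [10.0, -5.0, 20.0, 30.0, 15.0],
})

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return named check -> pass/fail, the building block of an automated suite."""
    return {
        "no_null_keys": bool(df["user_id"].notna().all()),
        "unique_keys": not df["user_id"].dropna().duplicated().any(),
        "non_negative_amounts": bool((df["amount"] >= 0).all()),
        "parsable_dates": bool(pd.to_datetime(df["signup_date"], errors="coerce").notna().all()),
    }

results = run_quality_checks(batch)
failed = [name for name, ok in results.items() if not ok]
print("FAILED:" if failed else "ALL CHECKS PASSED:", failed or list(results))
```

In a real framework each named check would typically become a test case (pytest or similar) wired into the ingestion and transformation jobs, with failures blocking downstream loads.
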
Tredence
Posted by Suchismita Das
Bengaluru (Bangalore), Gurugram, Chennai, Pune
8 - 10 yrs
Best in industry
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
R Programming
SQL

THE IDEAL CANDIDATE WILL

 

  • Engage with executive-level stakeholders from the client's team to translate business problems into a high-level solution approach
  • Partner closely with practice and technical teams to craft well-structured, comprehensive proposals / RFP responses that clearly highlight Tredence's competitive strengths relevant to the client's selection criteria
  • Actively explore the client’s business and formulate solution ideas that can improve process efficiency and cut cost, or achieve growth/revenue/profitability targets faster
  • Work hands-on across various MLOps problems and provide thought leadership
  • Grow and manage large teams with diverse skillsets
  • Collaborate, coach, and learn with a growing team of experienced Machine Learning Engineers and Data Scientists

ELIGIBILITY CRITERIA

 

  • BE/BTech/MTech (specialization/courses in ML/DS)
  • 7+ years of consulting services delivery experience
  • Very strong problem-solving skills and work ethic
  • Strong analytical/logical thinking, storyboarding, and executive communication skills
  • 5+ years of experience in Python/R and SQL
  • 5+ years of experience in NLP algorithms, regression and classification modelling, and time series forecasting
  • Hands-on work experience in DevOps
  • Good knowledge of deployment models such as PaaS, SaaS, and IaaS
  • Exposure to cloud technologies such as Azure, AWS, or GCP
  • Knowledge of Python and packages for data analysis (scikit-learn, scipy, numpy, pandas, matplotlib)
  • Knowledge of deep learning frameworks: Keras, TensorFlow, PyTorch, etc.
  • Experience with one or more container ecosystems (Docker, Kubernetes)
  • Experience building orchestration pipelines to convert plain Python models into deployable APIs/RESTful endpoints (see the sketch after this list)
  • Good understanding of OOP and data structures concepts
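
As a loose sketch of the last point in the list above (turning a plain Python model into a deployable RESTful endpoint), the snippet below wraps a stand-in scoring function with FastAPI. FastAPI/uvicorn and the toy input schema are assumptions made for illustration, not a stack prescribed by the posting.

```python
# Minimal sketch: exposing a plain Python "model" as a REST endpoint with FastAPI.
# Assumes `pip install fastapi uvicorn`; the model here is a stand-in for a trained artifact.
from fastapi import FastAPI
from pydantic import BaseModel

class Features(BaseModel):
    # Hypothetical input schema; a real service would mirror the model's feature vector.
    tenure_months: float
    monthly_spend: float

def toy_model(features: Features) -> float:
    """Stand-in scoring function; in practice this would call a loaded model's predict()."""
    return 1.0 if features.monthly_spend / max(features.tenure_months, 1.0) > 10 else 0.0

app = FastAPI(title="churn-scorer-sketch")

@app.post("/predict")
def predict(features: Features) -> dict:
    # The orchestration pipeline would build, containerize, and deploy this app.
    return {"score": toy_model(features)}

# Run locally with:  uvicorn main:app --reload
# then POST {"tenure_months": 3, "monthly_spend": 45} to /predict
```

In an MLOps pipeline this app would typically be containerized (Docker) and rolled out via the orchestration layer (Kubernetes, Helm), matching the container-ecosystem experience listed above.
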

Nice to Have:

 

  • Exposure to deployment strategies such as Blue/Green, Canary, A/B testing, and multi-armed bandit
  • Experience in Helm is a plus
  • Strong understanding of data infrastructure, data warehouse, or data engineering

 

You can expect to:

  • Work with the world's biggest retailers and help them solve some of their most critical problems. Tredence is a preferred analytics vendor for some of the largest retailers across the globe.
  • Create multi-million-dollar business opportunities by leveraging an impact mindset, cutting-edge solutions, and industry best practices.
  • Work in a diverse environment that keeps evolving.
  • Hone your entrepreneurial skills as you contribute to the growth of the organization.

Srijan Technologies
Posted by PriyaSaini
Remote only
3 - 8 yrs
₹5L - ₹12L / yr
Data Analytics
Data modeling
Python
PySpark
ETL

Role Description:

  • You will be part of the data delivery team and will have the opportunity to develop a deep understanding of the domain/function.
  • You will design and drive the work plan for the optimization/automation and standardization of the processes incorporating best practices to achieve efficiency gains.
  • You will run data engineering pipelines, link raw client data with the data model, conduct data assessments, perform data quality checks, and transform data using ETL tools.
  • You will perform data transformation, modeling, and validation activities, as well as configure applications to the client context. You will also develop scripts to validate, transform, and load raw data using programming languages such as Python and/or PySpark (a minimal sketch follows this list).
  • In this role, you will determine database structural requirements by analyzing client operations, applications, and programming.
  • You will develop cross-site relationships to enhance idea generation, and manage stakeholders.
  • Lastly, you will collaborate with the team to support ongoing business processes by delivering high-quality end products on-time and perform quality checks wherever required.
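
A minimal PySpark sketch of the validate-and-transform step described in the list above. The dataset, column names, and rules are hypothetical, and it assumes a local pyspark installation; a real pipeline would read from and write to the client's storage rather than an in-memory DataFrame.

```python
# Minimal PySpark sketch of a validate-and-transform step (hypothetical columns and rules).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("client-data-transform-sketch").getOrCreate()

# Stand-in for raw client data; in practice this would be spark.read.parquet(...) or a JDBC read.
raw = spark.createDataFrame(
    [(1, "2024-01-05", "  Retail "), (2, "2024-01-06", None), (2, "2024-01-06", "B2B")],
    ["client_id", "txn_date", "segment"],
)

# Basic quality checks: null keys and duplicated keys.
null_keys = raw.filter(F.col("client_id").isNull()).count()
dup_keys = raw.groupBy("client_id").count().filter(F.col("count") > 1).count()
print(f"null keys: {null_keys}, duplicated keys: {dup_keys}")

# Transform: de-duplicate, normalize fields, and cast the date, ready to load into the data model.
clean = (
    raw.dropDuplicates(["client_id"])
       .withColumn("segment", F.upper(F.trim(F.coalesce(F.col("segment"), F.lit("unknown")))))
       .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
)
clean.show()
spark.stop()
```
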

Job Requirement:

  • Bachelor’s degree in Engineering or Computer Science; Master’s degree is a plus
  • 3+ years of professional work experience with a reputed analytics firm
  • Expertise in handling large amounts of data with Python or PySpark
  • Ability to conduct data assessments, perform data quality checks, and transform data using SQL and ETL tools
  • Experience deploying ETL / data pipelines and workflows in cloud technologies and architectures such as Azure and Amazon Web Services will be valued
  • Comfort with data modelling principles (e.g. database structure, entity relationships, UID etc.) and software development principles (e.g. modularization, testing, refactoring, etc.)
  • A thoughtful and comfortable communicator (verbal and written) with the ability to facilitate discussions and conduct training
  • Strong problem-solving, requirement-gathering, and leadership skills
  • Track record of completing projects successfully on time, within budget and as per scope

Bengaluru (Bangalore)
4 - 9 yrs
₹15L - ₹18L / yr
Azure Data Factory
Azure Data Engineer
SQL
SQL Azure
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Author data services using a variety of programming languages
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Azure ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Work in an Agile environment with Scrum teams.
  • Ensure data quality and help in achieving data governance.


Basic Qualifications
  • 2+ years of experience in a Data Engineer role
  • Undergraduate degree required (Graduate degree preferred) in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Experience using the following software/tools:
  • Experience with big data tools: Hadoop, Spark, Kafka, etc.
  • Experience with relational SQL and NoSQL databases
  • Experience with data pipeline and workflow management tools
  • Experience with Azure cloud services: ADLS, ADF, ADLA, AAS
  • Experience with stream-processing systems: Storm, Spark-Streaming, etc.
  • Experience with object-oriented / functional scripting languages: Python, Java, C++, Scala, etc.
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases
  • Understanding of ELT and ETL patterns and when to use each, and of data models and transforming data into them (see the sketch after this list)
  • Experience building and optimizing ‘big data’ data pipelines, architectures, and data sets
  • Strong analytic skills related to working with unstructured datasets
  • Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
  • Experience supporting and working with cross-functional teams in a dynamic environment
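
The ELT-versus-ETL point in the list above can be made concrete with a small sketch: raw records are landed in the warehouse untouched, and the transformation then runs as SQL inside the engine. Python's built-in sqlite3 stands in for the warehouse so the example is self-contained; on Azure the same pattern would more typically land data in ADLS and transform it with a SQL pool or Spark.

```python
import sqlite3

# ELT sketch: load raw data first, then transform inside the database with SQL.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 1. Extract + Load: land the raw feed untouched in a staging table (all strings, no cleansing).
cur.execute("CREATE TABLE raw_events (user_id TEXT, event_type TEXT, amount TEXT)")
raw_feed = [("u1", "purchase", "19.99"), ("u2", "refund", "-5.00"), ("u1", "purchase", "5.01")]
cur.executemany("INSERT INTO raw_events VALUES (?, ?, ?)", raw_feed)

# 2. Transform inside the engine: cast types and aggregate into a reporting table.
cur.execute("""
    CREATE TABLE user_revenue AS
    SELECT user_id, SUM(CAST(amount AS REAL)) AS net_revenue
    FROM raw_events
    GROUP BY user_id
""")

for row in cur.execute("SELECT user_id, net_revenue FROM user_revenue ORDER BY user_id"):
    print(row)   # ('u1', 25.0) then ('u2', -5.0)

conn.close()
```

ETL would instead cast and aggregate before the load; ELT tends to win when the warehouse engine is the cheapest and most scalable place to do the heavy lifting.
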
Product based Company
Agency job
via Crewmates by Gowtham V
Coimbatore
4 - 15 yrs
₹5L - ₹25L / yr
ETL
Big Data
Hi Professionals,
We are looking for an ETL Developer for a reputed client in Coimbatore (permanent role).
Work Location: Coimbatore
Experience: 4+ years
Skills:
  • Strong experience in Talend or any other ETL tool (Informatica, DataStage, Talend)
  • Database: Teradata, Oracle, or SQL Server (preferred)
  • Supporting tools: JIRA, SVN
Notice Period: Immediate to 30 days
Saviance Technologies
Posted by Shipra Agrawal
NCR (Delhi | Gurgaon | Noida)
3 - 5 yrs
₹7L - ₹9L / yr
Power BI
Business Intelligence (BI)
DAX
Data modeling

 

Job Title: Power BI Developer (Onsite)
Location: Park Centra, Sector 30, Gurgaon
CTC: 8 LPA
Time: 1:00 PM - 10:00 PM

Must Have Skills: 

  • Power BI Desktop Software
  • DAX queries
  • Data modeling
  • Row-level security
  • Visualizations
  • Data Transformations and filtering
  • SSAS and SQL

 

Job description:

 

We are looking for a Power BI Analytics Lead responsible for efficient data visualization, DAX queries, and data modeling. The candidate will work on creating complex Power BI reports and will be involved in writing complex M and DAX queries, data modeling, row-level security, visualizations, data transformations, and filtering. The candidate will work closely with the client team to provide solutions and suggestions on Power BI.

 

Roles and Responsibilities:

 

  • Accurate, intuitive, and aesthetic Visual Display of Quantitative Information: We generate data, information, and insights through our business, product, brand, research, and talent teams. You would assist in transforming this data into visualizations that represent easy-to-consume visual summaries, dashboards and storyboards. Every graph tells a story.
  • Understanding Data: You would be performing and documenting data analysis, data validation, and data mapping/design. You would be mining large datasets to determine their characteristics and select appropriate visualizations.
  • Project Owner: You would develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions, and would be continuously reviewing and improving existing systems and collaborating with teams to integrate new systems. You would also contribute to the overall data analytics strategy by knowledge sharing and mentoring end users.
  • Perform ongoing maintenance & production of Management dashboards, data flows, and automated reporting.
  • Manage upstream and downstream impact of all changes on automated reporting/dashboards
  • Independently apply problem-solving ability to identify meaningful insights to business
  • Identify automation opportunities and work with a wide range of stakeholders to implement the same.
  • The ability and self-confidence to work independently and increase the scope of the service line

 

Requirements: 

  • 3+ years of work experience as an Analytics Lead / Senior Analyst / Sr. PBI Developer.
  • Sound understanding and knowledge of PBI Visualization and Data Modeling with DAX queries
  • Experience in leading and mentoring a small team.

TechUnity Software Systems India Pvt Ltd
Coimbatore
2 - 5 yrs
₹3L - ₹4L / yr
Data Visualization
SQL
Stackless Python
R Programming
matplotlib

We are looking for a Data Scientist to analyze large amounts of raw information to find patterns that will help improve our company.
We will rely on you to build data products to extract valuable business insights.
In this role, you should be highly analytical with a knack for analysis, math and statistics.
Critical thinking and problem-solving skills are essential for interpreting data.
We also want to see a passion for machine-learning and research.
Your goal will be to help our company analyze trends to make better decisions.
Responsibilities:

  • Identify valuable data sources and automate collection processes
  • Undertake preprocessing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams
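
As a brief illustration of the "build predictive models" and "combine models through ensemble modeling" responsibilities above, here is a small scikit-learn sketch on a bundled toy dataset. scikit-learn and the chosen estimators are assumptions made for illustration, not tools mandated by the posting.

```python
# Small scikit-learn sketch: two base models combined through a soft-voting ensemble.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

ensemble = VotingClassifier(
    estimators=[
        ("logreg", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
        ("forest", RandomForestClassifier(n_estimators=200, random_state=42)),
    ],
    voting="soft",  # average the base models' predicted probabilities
)
ensemble.fit(X_train, y_train)

print("ensemble accuracy:", round(accuracy_score(y_test, ensemble.predict(X_test)), 3))
```
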

Requirements:

  • Proven experience as a Data Scientist or Data Analyst
  • Experience in data mining
  • Understanding of machine-learning and operations research
  • Knowledge of SQL, Python, R, ggplot2, matplotlib, seaborn, Shiny, and Dash; familiarity with Scala, Java, or C++ is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks
  • Analytical mind and business acumen
  • Strong math skills in statistics, algebra
  • Problem-solving aptitude
  • Excellent communication and presentation skills
  • BSc/BE in Computer Science, Engineering, or a relevant field; a graduate degree in Data Science or another quantitative field is preferred
fintech
Agency job
via Talentojcom by Raksha Pant
Remote only
2 - 6 yrs
₹9L - ₹30L / yr
ETL
Druid Database
Java
Scala
SQL
  • Education in a science, technology, engineering, or mathematics discipline, preferably a bachelor's degree or equivalent experience
  • Knowledge of database fundamentals and fluency in advanced SQL, including concepts such as windowing functions (see the sketch after this list)
  • Knowledge of popular scripting languages for data processing, such as Python, as well as familiarity with common frameworks such as Pandas
  • Experience building streaming ETL pipelines with tools such as Apache Flink, Apache Beam, Google Cloud Dataflow, DBT, and equivalents
  • Experience building batch ETL pipelines with tools such as Apache Airflow, Spark, DBT, or custom scripts
  • Experience working with messaging systems such as Apache Kafka (and hosted equivalents such as Amazon MSK) and Apache Pulsar
  • Familiarity with BI applications such as Tableau, Looker, or Superset
  • Hands-on coding experience in Java or Scala
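
A minimal sketch of the SQL windowing functions called out in the list above, run through Python's built-in sqlite3 (window functions require SQLite 3.25+, which ships with recent Python releases). Table and column names are illustrative.

```python
import sqlite3

# Window-function sketch: sequence each user's payments and compute a running total.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE payments (user_id TEXT, paid_at TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO payments VALUES (?, ?, ?)",
    [("u1", "2024-01-01", 10.0), ("u1", "2024-01-03", 15.0), ("u2", "2024-01-02", 7.5)],
)

query = """
SELECT user_id,
       paid_at,
       amount,
       ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY paid_at) AS payment_seq,
       SUM(amount)  OVER (PARTITION BY user_id ORDER BY paid_at) AS running_total
FROM payments
ORDER BY user_id, paid_at
"""
for row in cur.execute(query):
    print(row)
# ('u1', '2024-01-01', 10.0, 1, 10.0), ('u1', '2024-01-03', 15.0, 2, 25.0), ('u2', ...)

conn.close()
```
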
Helical IT Solutions
Posted by Niyotee Gupta
Hyderabad
1 - 5 yrs
₹3L - ₹8L / yr
ETL
Big Data
TAC
PL/SQL
Relational Database (RDBMS)

ETL Developer – Talend

Job Duties:

  • Responsible for the design and development of ETL jobs that follow standards and best practices and are maintainable, modular, and reusable.
  • Proficiency with Talend or Pentaho Data Integration / Kettle.
  • Analyze and review complex object and data models and the metadata repository in order to structure the processes and data for better management and efficient access.
  • Work on multiple projects and delegate work to Junior Analysts to deliver projects on time.
  • Train and mentor Junior Analysts and build their proficiency in the ETL process.
  • Prepare mapping documents to extract, transform, and load data, ensuring compatibility with all tables and requirement specifications.
  • Experience in ETL system design and development with Talend / Pentaho PDI is essential.
  • Create quality rules in Talend.
  • Tune Talend / Pentaho jobs for performance optimization.
  • Write relational (SQL) and multidimensional (MDX) database queries.
  • Functional knowledge of Talend Administration Center / Pentaho Data Integrator, job servers and load-balancing setup, and all their administrative functions.
  • Develop, maintain, and enhance unit test suites to verify the accuracy of ETL processes, dimensional data, OLAP cubes, and various forms of BI content including reports, dashboards, and analytical models (see the sketch after this list).
  • Exposure to the MapReduce components of Talend / Pentaho PDI.
  • Comprehensive understanding and working knowledge of data warehouse loading, tuning, and maintenance.
  • Working knowledge of relational database theory and dimensional database models.
  • Creating and deploying Talend / Pentaho custom components is an add-on advantage.
  • Java knowledge is nice to have.
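
As a rough illustration of the unit-test duty referenced in the list above, here is a minimal Python unittest sketch against a toy transformation function. The transformation and field names are hypothetical; in a Talend or Pentaho setup the equivalent assertions would run against job outputs (row counts, dimensional integrity, cube totals) rather than a Python function.

```python
import unittest

def transform(rows):
    """Toy ETL transformation: drop records without an id and normalise country codes."""
    return [
        {"id": r["id"], "country": r["country"].strip().upper()}
        for r in rows
        if r.get("id") is not None
    ]

class EtlTransformTests(unittest.TestCase):
    def test_drops_rows_without_id(self):
        out = transform([{"id": None, "country": "in"}, {"id": 7, "country": "in"}])
        self.assertEqual([r["id"] for r in out], [7])

    def test_normalises_country_codes(self):
        out = transform([{"id": 1, "country": " in "}])
        self.assertEqual(out[0]["country"], "IN")

    def test_row_count_preserved_for_clean_input(self):
        clean = [{"id": i, "country": "IN"} for i in range(5)]
        self.assertEqual(len(transform(clean)), len(clean))

if __name__ == "__main__":
    unittest.main()
```
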

Skills and Qualification:

  • BE, B.Tech / MS Degree in Computer Science, Engineering or a related subject.
  • 3+ years of experience.
  • Proficiency with Talend or Pentaho Data Integration / Kettle.
  • Ability to work independently.
  • Ability to handle a team.
  • Good written and oral communication skills.