SQL Azure Jobs in Mumbai
Required Skill Set :--
- Data Model & Mapping
- MS SQL Database
- Analytical SQL queries
- Genesys Cloud Reporting & Analytics API
- Snowflake (good to have)
- Cloud Exposure – AWS or Azure
Technical Experience –
· 5-8 years of experience, preferably at a technology or financial firm
· Strong understanding of data analysis & reporting tools.
· Experience with data mining & machine learning techniques.
· Excellent communication & presentation skills
· Must have at least 2-3 years of experience in data modeling/analysis/mapping
· Must have hands-on experience with database tools and technologies
· Must have exposure to Genesys Cloud, WFM, GIM, and the Genesys Analytics API
· Good to have experience or exposure to Salesforce, AWS or Azure, and Genesys Cloud
· Ability to work independently & as part of a team
· Strong attention to detail and accuracy.
Work Scope –
- Data model similar to the GIM database, based on Genesys Cloud data.
- API to column data mapping (see the sketch after this list).
- Data model for business analytics.
- Database artifacts.
- Scripting – Python
- Autosys, TWS job setup.
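To make the "API to column data mapping" item concrete, here is a minimal sketch, assuming a hypothetical Genesys Cloud-style analytics payload, of flattening nested API JSON into flat columns for a GIM-like relational model with pandas; all field and column names are illustrative.

```python
# Hedged sketch: map nested API JSON to flat relational columns.
# The payload shape and target column names are illustrative assumptions.
import pandas as pd

# Illustrative payload shaped like a conversation-analytics response
payload = [
    {
        "conversationId": "c-001",
        "participants": {"agentId": "a-17", "queueId": "q-3"},
        "metrics": {"tTalk": 182.5, "tHeld": 12.0},
    },
    {
        "conversationId": "c-002",
        "participants": {"agentId": "a-09", "queueId": "q-1"},
        "metrics": {"tTalk": 96.0, "tHeld": 0.0},
    },
]

# json_normalize maps nested keys to dotted column names in one step
df = pd.json_normalize(payload)

# Rename to the target columns of the (hypothetical) GIM-style table
df = df.rename(columns={
    "participants.agentId": "agent_id",
    "participants.queueId": "queue_id",
    "metrics.tTalk": "talk_time_sec",
    "metrics.tHeld": "held_time_sec",
})
print(df)
```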
Data Scientist – Program Embedded
Job Description:
We are seeking a highly skilled and motivated senior data scientist to support a big data program. The successful candidate will play a pivotal role in supporting multiple projects in this program, covering traditional tasks from revenue management, demand forecasting, and improving customer experience to testing and using new tools/platforms such as Copilot and Fabric for different purposes. The expected candidate will have deep expertise in machine learning methodology and applications, and should have completed multiple large-scale data science projects (full cycle, from ideation to BAU). Beyond technical expertise, problem solving in complex set-ups will be key to success in this role. This is a data science role directly embedded in the program/projects, so stakeholder management and collaboration with partners are crucial to success in this role (on top of the deep expertise).
What we are looking for:
- Highly proficient in Python/PySpark/R.
- Understanding of MLOps concepts and working experience in product industrialization (from a data science point of view), including building products for live deployment with continuous integration and continuous delivery.
- Familiarity with cloud platforms such as Azure and GCP and the data management systems on those platforms, including Databricks and product deployment on Databricks.
- Experience in ML projects involving techniques such as regression, time series, clustering, classification, dimension reduction, and anomaly detection, with both traditional ML and DL approaches (see the sketch after this list).
- Solid background in statistics: probability distributions, A/B testing validation, univariate/multivariate analysis, hypothesis testing for different purposes, data augmentation, etc.
- Familiarity with designing testing frameworks for different modelling practices/projects based on business needs.
- Exposure to Gen AI tools, with enthusiasm for experimenting and new ideas on what can be done.
- Experience improving an internal company process using an AI tool would be great (e.g. process simplification, manual-task automation, automated emails).
- Ideally, 10+ years of experience, including independent, business-facing roles.
- CPG or retail experience as a data scientist would be nice, but it is not the top priority, especially for those who have navigated multiple industries.
- Being proactive and collaborative would be essential.
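As a rough illustration of the ML techniques listed above, the sketch below runs regression, clustering, and anomaly detection with scikit-learn on synthetic data; the data and parameters are assumptions for the example, not program specifics.

```python
# A minimal, illustrative sketch of the classical techniques named above
# (regression, clustering, anomaly detection) on synthetic data.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.cluster import KMeans
from sklearn.ensemble import IsolationForest
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([1.5, -2.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=500)

# Regression with a held-out test split
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
reg = Ridge(alpha=1.0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, reg.predict(X_test)))

# Clustering and anomaly detection on the same features
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
outliers = IsolationForest(random_state=0).fit_predict(X)  # -1 marks anomalies
print("cluster sizes:", np.bincount(labels), "anomalies:", (outliers == -1).sum())
```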
Some projects examples within the program:
- Test new tools/platforms such as Copilot and Fabric for commercial reporting: testing, validation, and building trust.
- Building algorithms for predicting trends in category and consumption to support dashboards.
- Revenue Growth Management: create/understand the algorithms behind the tools we need to maintain or choose to improve (which can be built by third parties). Able to prioritize and build a product roadmap. Able to design new solutions and articulate/quantify their limitations.
- Demand forecasting: create localized forecasts to improve in-store availability, with proper model monitoring for early detection of potential issues in the forecast, focusing particularly on improving the end-user experience (a minimal monitoring sketch follows below).
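A minimal sketch of the forecast-monitoring idea in the last item: a naive per-store baseline with a rolling-MAPE alert. The store/week/units columns, the 8-week window, and the 0.25 threshold are illustrative assumptions.

```python
# Hedged sketch: naive seasonal baseline per store plus a rolling MAPE check
# for early detection of degraded forecasts. All names are illustrative.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "store": np.repeat(["S1", "S2"], 52),
    "week": np.tile(np.arange(52), 2),
    "units": rng.poisson(100, 104),
})

# Naive forecast: last observed value per store (a baseline any model must beat)
df["forecast"] = df.groupby("store")["units"].shift(1)

# Monitoring: rolling MAPE per store; alert when it drifts above a threshold
df["ape"] = (df["units"] - df["forecast"]).abs() / df["units"]
mape = df.groupby("store")["ape"].rolling(8).mean().reset_index(level=0)
alerts = mape[mape["ape"] > 0.25]  # assumed alert threshold
print(alerts.head())
```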
KEY RESPONSIBILITIES
· Develop high-quality database solutions.
· Use T-SQL to develop and implement procedures and functions.
· Review and interpret ongoing business report requirements.
· Research required data.
· Build appropriate and useful reporting deliverables.
· Analyze existing SQL queries for performance improvements.
· Suggest new queries.
· Develop procedures and scripts for data migration.
· Provide timely scheduled management reporting.
· Investigate exceptions with regard to asset movements.
MUST-HAVES FOR THIS GIG
T-SQL, stored procedures, functions, triggers, XML operations, JSON support on SQL Server 2016 and above, SSIS, SSRS, CTEs, EAV data structures, integration with NoSQL (MongoDB), SQL Server indexes, bulk insert, BCP, CMD shell, memory optimization, performance tuning, query optimization, database design, table joins, SQL Server Agent jobs, backup and maintenance plans, data migration, good communication (a small sketch of the T-SQL essentials follows below).
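A small sketch of the T-SQL essentials above, driven from Python with pyodbc: a parameterized stored-procedure call and JSON shredding with OPENJSON (SQL Server 2016+). The connection string and dbo.usp_LoadAssetMovements are hypothetical.

```python
# Hedged sketch: parameterized stored-procedure call plus OPENJSON shredding.
# Server, database, and procedure name are placeholders, not from the post.
import json
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;"
    "DATABASE=mydb;Trusted_Connection=yes;"  # assumption: Windows auth
)
cur = conn.cursor()

# Parameterized stored-procedure call (avoids SQL injection)
cur.execute("EXEC dbo.usp_LoadAssetMovements @AsOfDate = ?", "2024-01-31")

# JSON support on SQL Server 2016+: shred a JSON document server-side
payload = json.dumps([{"id": 1, "qty": 10}, {"id": 2, "qty": 5}])
cur.execute(
    """
    SELECT j.id, j.qty
    FROM OPENJSON(?) WITH (id INT '$.id', qty INT '$.qty') AS j
    """,
    payload,
)
for row in cur.fetchall():
    print(row.id, row.qty)
conn.commit()
```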
NICE-TO-HAVES FOR THIS GIG:
- Working knowledge of mobile development activity.
- Working knowledge of web hosting solution on IIS7.
- Experience working with an offshore-onsite development process
Title:- Data Scientist
Experience:- 6 years
Work Mode:- Onsite
Primary Skills:- Data Science, SQL, Python, Data Modelling, Azure, AWS, Banking Domain (BFSI/NBFC)
Qualification:- Any
Roles & Responsibilities:-
1. Acquiring, cleaning, and preprocessing raw data for analysis.
2. Utilizing statistical methods and tools for analyzing and interpreting complex datasets.
3. Developing and implementing machine learning models for predictive analysis.
4. Creating visualizations to effectively communicate insights to both technical and non-technical stakeholders.
5. Collaborating with cross-functional teams, including data engineers, business analysts, and domain experts.
6. Evaluating and optimizing the performance of machine learning models for accuracy and efficiency.
7. Identifying patterns and trends within data to inform business decision-making.
8. Staying updated on the latest advancements in data science, machine learning, and relevant technologies.
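The sketch below walks responsibilities 1-4 (acquire, clean, model, visualize) on synthetic data; the loan-default framing and column names are assumptions chosen to echo the BFSI domain, not part of the role.

```python
# Hedged sketch of the acquire -> clean -> model -> visualize loop.
# All data here is synthetic; the default_flag target is an assumption.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "income": rng.lognormal(10, 0.5, 1000).round(2),
    "age": rng.integers(21, 65, 1000),
})
# Synthetic target: default is more likely at lower incomes
df["default_flag"] = (rng.random(1000) < 1 / (1 + df["income"] / 30000)).astype(int)

X, y = df[["income", "age"]], df["default_flag"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("holdout accuracy:", model.score(X_te, y_te))

# Step 4: a stakeholder-friendly view of default rate by age band
df.groupby(pd.cut(df["age"], bins=5), observed=True)["default_flag"].mean().plot(kind="bar")
plt.ylabel("default rate")
plt.tight_layout()
plt.savefig("default_rate_by_age.png")
```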
Requirement:-
1. Experience with modeling techniques such as linear regression, clustering, and classification.
2. Must have a passion for data, structured or unstructured. 0.6 – 5 years of hands-on experience with Python and SQL is a must.
3. Should have sound experience in data mining, data analysis and machine learning techniques.
4. Excellent critical thinking, verbal and written communications skills.
5. Ability and desire to work in a proactive, highly engaging, high-pressure, client service environment.
6. Good presentation skills.
PL/SQL Developer
Experience of 4 to 6 years
Skills: MS SQL Server and Oracle, AWS or Azure
• Experience in setting up RDS service in cloud technologies such as AWS or Azure
• Strong proficiency with SQL and its variation among popular databases
• Should be well-versed in writing stored procedures, functions, and packages, and in using collections (see the sketch after this list)
• Skilled at optimizing large, complicated SQL statements.
• Should have worked in migration projects.
• Should have worked on creating reports.
• Should be able to distinguish between normalized and de-normalized data modelling designs and use cases.
• Knowledge of best practices when dealing with relational databases
• Capable of troubleshooting common database issues
• Familiar with tools that can aid with profiling server resource usage and optimizing it.
• Proficient understanding of code versioning tools such as Git and SVN
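A hedged sketch of driving PL/SQL from Python with the python-oracledb driver: a packaged procedure call with an OUT bind and a bound query. The credentials, DSN, pkg_reports.generate_daily, and the orders table are hypothetical.

```python
# Hedged sketch: call a PL/SQL packaged procedure and run a bound query.
# Connection details and database objects are placeholders.
import oracledb

conn = oracledb.connect(user="app", password="secret", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# Call a packaged procedure with an OUT parameter for the row count
rows_loaded = cur.var(int)
cur.callproc("pkg_reports.generate_daily", ["2024-01-31", rows_loaded])
print("rows loaded:", rows_loaded.getvalue())

# Bind variables keep large, repeated statements shareable and fast
cur.execute(
    """
    SELECT status, COUNT(*)
    FROM orders
    WHERE order_date > TO_DATE(:cutoff, 'YYYY-MM-DD')
    GROUP BY status
    """,
    cutoff="2024-01-01",
)
for status, n in cur:
    print(status, n)
conn.commit()
```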
With a leading Business Process Management (BPM) company
Job Summary
- Candidate will be responsible for providing full life-cycle development (design, coding, and testing) and maintenance of web-based systems on Azure
- Candidate should have experience in GitHub, knowledge of DevOps is a plus
- Experienced in designing and implementing web portals; experience with DNN is a must
- Ability to work with multiple languages including C#, ASP.Net, MVC, Javascript and related libraries, HTML, Complex SQL queries, CSS, BootStrap, JSON.
- Experience in Agile project management methodology
- Developing and delivering excellent web-based solutions/portals/sites based on the customer's requirements within the stipulated timeline
- The candidate should be flexible to learn new technologies and platforms, and should be creative, innovative with improvement ideas, detail-oriented, diligent, and eager to learn and grow
Duties and Responsibilities
- Understand business requirements to apply logic to integrate functionalities
- Identify and understand any technical bugs on the server, site, log files or modules and work on resolving the bugs
- Understand how the FTP server is set up for the site (see the deployment sketch after this list)
- Understand system/site technical requirements and suggest enhancements if applicable
- Designing, coding, unit Testing, and integration with Database
- Handle site deployment
- Designing, coding, debugging, technical problem solving, and writing Unit Test cases, etc.
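A minimal deployment sketch over FTP using Python's standard-library ftplib, referenced from the duties above; the host, credentials, and paths are placeholders.

```python
# Hedged sketch: upload build artifacts to a site's FTP root.
# Host, credentials, and directories are illustrative placeholders.
import os
from ftplib import FTP

LOCAL_BUILD = "dist"      # hypothetical local build output
REMOTE_ROOT = "/wwwroot"  # hypothetical remote site root

with FTP("ftp.example.com") as ftp:
    ftp.login(user="deploy", passwd="secret")
    ftp.cwd(REMOTE_ROOT)
    for name in os.listdir(LOCAL_BUILD):
        path = os.path.join(LOCAL_BUILD, name)
        if os.path.isfile(path):
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {name}", fh)  # upload one artifact
            print("uploaded", name)
```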
Qualifications
Education / Certification
- B.E. / B.Tech. /MSC in Computer Science or IT.
- MCAD/MCSD/MSITP/MCPD
Technical Expertise
- ASP/ASP.NET/VB.NET/MVC/C#/SQL Server 2012+
- HTML, Javascript, Jquery, CSS, Bootstrap
- GitHub/DevOps, Azure
- Web API/ Web Services, Email Services
Skills and Abilities
- Be able to work with diverse global teams and in an individual contributor role as needed
- Excellent English written and verbal communication skills (for local team and global stakeholders/team members)
- Strong task management skills including time management, and ability to manage multiple projects simultaneously
- Flexibility required to attend late evening meetings with global team members
- Attention to detail and delivering quality projects and knowledge assets
Blenheim Chalcot IT Services India Pvt Ltd
You will work on Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types, including text, video, and audio through to live streams and IoT, in an agile project delivery environment with a focus on DataOps and Data Observability. You will work with Azure SQL Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure Machine Learning, Azure Service Bus, Azure Serverless (Logic Apps, Function Apps), and Azure Data Catalog and Purview, among other tools, gaining opportunities to learn some of the most advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with responsibility for the design, implementation, and documentation of the technical aspects, including integration, to ensure the solution meets customer requirements. You will work closely with fellow architects, engineers, analysts, team leads, and project managers to plan, build, and roll out data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise of data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX) across a wide range of electronic delivery mechanisms (API/SFTP/etc.); see the sketch after this list.
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/object-function scripting languages such as Python, Java, JavaScript, C#, Scala, etc. is required.
Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
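A small sketch of the feed-integration item above: ingest one flat (CSV) and one structured (JSON) feed with pandas and land both in Azure SQL via SQLAlchemy in a single transaction. The connection string, file paths, and staging table names are assumptions.

```python
# Hedged sketch of a small ETL step: ingest two feeds and land them in
# Azure SQL staging tables. Connection details and paths are placeholders.
import pandas as pd
import sqlalchemy as sa

engine = sa.create_engine(
    "mssql+pyodbc://user:pass@myserver.database.windows.net/mydb"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Flat schema: a CSV delivered over SFTP and synced locally (illustrative path)
csv_feed = pd.read_csv("feeds/customers.csv")
# Structured schema: JSON records pulled from an API (illustrative path)
json_feed = pd.read_json("feeds/orders.json")

with engine.begin() as conn:  # one transaction for both loads
    csv_feed.to_sql("stg_customers", conn, if_exists="replace", index=False)
    json_feed.to_sql("stg_orders", conn, if_exists="replace", index=False)
```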
Essential Experience:
5 or more years of hands-on experience in a data architect role with the development of ingestion,
integration, data auditing, reporting, and testing with Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its resources to implement solutions is a must.
Experience working in the public sector or in an organisation servicing the public sector is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences: in writing, through presentations, and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect
Excellent interpersonal skills with teams and building trust with clients
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
We will disclose the client name after the initial screening.
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience, including at least 3 years in an architect role, along with at least 3 to 4 years' experience designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Develop batch processing, streaming, and integration solutions, and process structured and unstructured data.
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies.
- Experience performing design, development, and deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS); see the sketch after this list.
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premise and in the cloud; VMs, Networking, VPN (Express Route), Active Directory, Storage (Blob, etc.), Windows/Linux).
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
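An illustrative PySpark batch job of the kind the role describes: read raw files from ADLS Gen2, aggregate, and write a curated layer. The abfss:// paths and column names are assumptions, and storage credentials are presumed configured (e.g. on a Databricks cluster).

```python
# Hedged sketch: raw-to-curated batch aggregation on ADLS Gen2.
# Paths, columns, and the partitioning scheme are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_sales_batch").getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@mydatalake.dfs.core.windows.net/sales/2024/")
)

curated = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .groupBy("region", "sale_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("txn_count"))
)

(
    curated.write.mode("overwrite")
    .partitionBy("sale_date")
    .parquet("abfss://curated@mydatalake.dfs.core.windows.net/sales_daily/")
)
```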
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
Minimum of 15 years of experience with Informatica ETL and database technologies. Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake. Exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture; develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Head complex ETL requirements and design; implement an Informatica-based ETL solution fulfilling stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements; assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements; develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Play an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist in and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfils requirements and adheres to the ETL architecture.
- Building and operationalizing large-scale enterprise data solutions and applications using one or more Azure data and analytics services in combination with custom solutions: Azure Synapse/Azure SQL DWH, Azure Data Lake, Azure Blob Storage, Spark, HDInsight, Databricks, Cosmos DB, Event Hub/IoT Hub.
- Experience in migrating on-premises data warehouses to data platforms on the Azure cloud.
- Designing and implementing data engineering, ingestion, and transformation functions
- Experience with Azure Analysis Services
- Experience in Power BI
- Experience with third-party solutions like Attunity/StreamSets and Informatica
- Experience with PreSales activities (Responding to RFPs, Executing Quick POCs)
- Capacity Planning and Performance Tuning on Azure Stack and Spark.
Object-oriented languages (e.g. Python, PySpark, Java, C#, C++) and frameworks (e.g. J2EE or .NET)
10FA India Private Limited. Formerly known as Prudential Glob
- Proficiency in integrating various Azure resources (IaaS and PaaS: SQL DB, App Service, Application Insights, Databricks, storage accounts, etc.) to deliver end-to-end automation.
- Thorough understanding of Continuous integration and continuous delivery using Azure DevOps/VSTS.
- Performing cost analysis of the Azure platform to identify where cost efficiencies could be had.
- Proficiency and thorough understanding of Azure RBAC model.
- Sound understanding of Azure Active Directory and conditional access policies.
- Good grasp of Azure governance principles and hands-on experience in rolling out compliance and governance policies.
- Proficiency in developing infrastructure automation scripts in the form of ARM templates and Azure PowerShell scripts, which can then be provided to application teams as consumables (see the sketch at the end of this list).
- Effective communication skills, both written and verbal for technical and non-technical audiences.
- Good working and hands-on knowledge of Azure IaaS, VNets, subnets, firewalls, and NSGs. Sound understanding of networking, including DNS, and of firewall security products such as Palo Alto.
- Experience working with Confluence, JIRA, Bitbucket, git, Jenkins, Sonar for collaboration and continuous integration.
- Experience with agile methods, along with having found their limitations and ways to overcome them.
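The infrastructure-automation item above mentions ARM templates and Azure PowerShell; as a hedged sketch of the same flow from Python, the snippet below deploys a minimal template with azure-identity and azure-mgmt-resource. The subscription ID, resource group, and storage account name are placeholders, and the exact SDK surface may vary by package version.

```python
# Hedged sketch: deploy a minimal ARM template via the Azure Python SDK.
# Subscription, resource group, and resource names are placeholders.
import json
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# A minimal ARM template: one StorageV2 account (name must be globally unique)
template = json.loads("""{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [{
    "type": "Microsoft.Storage/storageAccounts",
    "apiVersion": "2022-09-01",
    "name": "mystorageacct12345",
    "location": "eastus",
    "sku": {"name": "Standard_LRS"},
    "kind": "StorageV2"
  }]
}""")

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")
poller = client.deployments.begin_create_or_update(
    "my-resource-group",
    "example-deployment",
    {"properties": {"mode": "Incremental", "template": template}},
)
print(poller.result().properties.provisioning_state)
```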