11+ Technical analysis Jobs in Pune | Technical analysis Job openings in Pune
Application development:
Development of forms and reports.
Creation of SQL packages, functions, procedures, etc.
Should have worked extensively on Oracle EBS implementation, upgrade, and support projects, using tools such as JIRA.
Profile: Senior Data Engineer (Azure)
Experience Required: 6+ Years
Work Mode: Hybrid
Location: Gurgaon, Pune, Jaipur, Bangalore, Bhopal
Employment Type: Full-time
About the Role:
We are seeking an experienced Senior Data Engineer with strong expertise in Microsoft Azure ecosystem to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining enterprise-scale data solutions on Azure, working with cutting-edge cloud technologies and big data platforms.
Key Responsibilities:
- Data processing on Azure: Azure Data Factory, Azure Stream Analytics, Event Hubs, Azure Databricks, Database Migration Service, and data pipelines.
- Provisioning, configuring, and developing Azure solutions (ADB, ADF, ADW, etc.).
- Design and implement scalable data models and migration strategies.
- Work on distributed big data batch or streaming pipelines (Kafka or similar).
- Develop data integration and transformation solutions for structured and unstructured data.
- Collaborate with cross-functional teams for performance tuning and optimization.
- Monitor data workflows and ensure compliance with data governance and quality standards.
- Contribute to continuous improvement through automation and DevOps practices.
Required Skills & Experience:
- 6-10 years of experience as a Data Engineer.
- Strong proficiency in Azure Databricks, PySpark, Python, SQL, and Azure Data Factory (see the sketch after this list).
- Experience in Data Modelling, Data Migration, and Data Warehousing.
- Good understanding of database structure principles and schema design.
- Hands-on experience using MS SQL Server, Oracle, or similar RDBMS platforms.
- Experience with DevOps tools (Azure DevOps, Jenkins, Airflow, Azure Monitor) is good to have.
- Knowledge of distributed data processing and real-time streaming (Kafka/Event Hub).
- Familiarity with visualization tools like Power BI or Tableau.
- Strong analytical, problem-solving, and debugging skills.
- Self-motivated, detail-oriented, and capable of managing priorities effectively.
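As a rough, hedged illustration of the Databricks/PySpark pipeline work listed above (not taken from the posting), here is a minimal sketch of a batch transformation step; the storage path, column names, and table name are assumptions.

```python
# Minimal PySpark sketch (hypothetical paths, columns, and table name) of a
# batch transformation step of the kind a Databricks/ADF pipeline might run.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

# Read raw CSV files landed by an upstream ingestion step (assumed path).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/"))

# Basic cleansing and a derived column.
clean = (raw
         .dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_date"))
         .withColumn("net_amount", F.col("amount") - F.col("discount")))

# Write to a Delta table partitioned by date for downstream consumption.
(clean.write
 .format("delta")
 .mode("overwrite")
 .partitionBy("order_date")
 .saveAsTable("analytics.orders_clean"))
```

In practice a step like this would typically be scheduled and orchestrated by Azure Data Factory or a Databricks job rather than run by hand.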
Job Description:
Minimum 2 to 4 years of experience in C#, ASP.NET, and web application development.
Knowledge of cloud programming or cloud migration is preferred.
Mandatory skills:
- Proficient in web application development using ASP.NET and C# with .NET 4.0/4.5.
- Experience with SQL Server or an equivalent database, and the ability to write efficient queries.
- Strong knowledge of jQuery, AJAX, JavaScript, HTML5, CSS3, and Bootstrap.
- Experience debugging across multiple browsers.
- Strong understanding of object-oriented programming.
- Clear understanding of SVN or an equivalent VCS.
- Familiarity with IIS and deploying code to a web server.
- Should have excellent analytical and communication skills.
Responsibilities:
- Hands-on experience with designing, coding, debugging, technical problem solving, writing unit test cases, etc.
- Translate use cases into functional applications
- Design, build, and maintain efficient, reusable, and reliable C# code
- Ensure the best possible performance, quality, and responsiveness of applications
- Help maintain code quality
- Able to work well in a team setting
Academic Qualifications Required:
- B.E. / B.Tech. / M.Sc. in Computer Science or IT, or M.C.A.
- Bachelor's degree or higher (or foreign equivalent) required, preferably in a related area.
- At least 5 years of experience with Duck Creek Data Insights as a Technical Architect / Senior Developer.
- Strong technical knowledge of SQL databases and MSBI.
- Strong hands-on knowledge of the Duck Creek Insights product, SQL Server/database-level configuration, T-SQL, XSL/XSLT, MSBI, etc.
- Well versed in the Duck Creek Extract Mapper architecture.
- Strong understanding of Data Modelling, Data Warehousing, Data Marts, and Business Intelligence, with the ability to solve business problems.
- Strong understanding of the ETL and EDW toolsets within Duck Creek Data Insights.
- Strong knowledge of the overall Duck Creek Insights product architecture flow, Data Hub, Extract Mapper, etc.
- Understanding of data related to the business application areas: policy, billing, and claims business solutions.
- Minimum 4 to 7 years of working experience with the Duck Creek Insights product.
- Experience in the insurance domain is preferable.
- Experience with Duck Creek Data Insights is preferable.
- Experience specific to Duck Creek would be an added advantage
- Strong knowledge of database structures, systems, and data mining.
- Excellent organisational and analytical abilities
- Outstanding problem solver
Key areas of responsibility:
- Maintenance and enhancements of current websites.
- System requirement gathering, development, testing and maintenance for new websites or applications.
- Managing user access to the various components of the company’s IT infrastructure based on an assessment of the user requirement and security.
- Managing the admin section for the company’s websites.
- Creating an environment in the Dreamweaver editor for users to develop applications.
- Documentation.
- Designing a database and coordinating the UI design.
- Integrating the MySQL database with the PHP applications.
- Maintaining web-based PHP applications.
- Designing and developing applications and websites.
- Resolving issues related to PHP development in different applications.
Experience and skills required:
Skills: PHP, MySQL, MVC, JavaScript, jQuery, HTML, CSS, AJAX, WordPress.
Experience: Minimum 3 years.
Qualification: Any technical background (MCA, B.E., M.Sc. Computer Science, etc.).
• Strong in Java 8: the Stream API and non-blocking APIs
• Strong in collections, generics, data structures, and multi-threading
• Design patterns and SOLID principles
• Problem solving and hands-on experience writing complex implementations
• Responsible for performing daily operational tasks and maintaining availability at the customer site(s). Provisions solutions based on standardized procedures as outlined by Dell Technologies best practice documentation.
• Excellent troubleshooting skills with a proven track record of successful incident management.
• Participates in the design and operational execution of the customer's disaster recovery process as required. Performs necessary storage infrastructure maintenance and data migration, as required.
• Replication sessions: troubleshooting any errors encountered in replication sessions, and establishing replication sessions for new sites.
• Seeks advice or assistance from management and/or Technical Support as required during difficult customer situations. Works in conjunction with EMC colleagues to ensure effective resolution of technical issues encountered during implementations.
• Mentors junior staff.
• Good understanding of operating systems (Windows and Linux).
Essential Requirements:
• Specialist skills/knowledge of VMAX, PowerMax, VNX Block, Unity, Isilon, VPLEX, and SAN switches, along with support knowledge of SRM, Cisco, and Brocade.
• Proficiency in hardware, software, and/or operating system environments.
• Provide on-call support.
• Expert certification in a relevant technology/product.
• Presentation and negotiating skills.
• Organizational skills.
Experience:
• 5+ years of relevant experience
You can code comfortably in Python
Working knowledge of streaming media protocols, technologies, and standards (streaming, compression, and transcoding): HTTP Live Streaming (HLS), RTMP, RTSP, etc. (see the HLS sketch below).
Good grasp of Linux and cloud servers (AWS or Azure).
Working knowledge of Data Structures and Algorithms.
Working knowledge of SQL, NoSQL, and graph databases (MySQL, MongoDB, Cassandra, Redis, SQL/JSON).
Working knowledge of API architectures and micro-services.
Good working knowledge of GitHub and Docker.
Working knowledge of distributed computing and multi-processing.
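As a rough, hedged illustration of the HLS packaging mentioned above (not part of the posting), the sketch below shells out to ffmpeg from Python to segment a source file into an HLS playlist; it assumes ffmpeg is installed on the PATH, and the file names are made up.

```python
# Hypothetical sketch: package a source video into a single HLS rendition
# using ffmpeg (assumes ffmpeg is on PATH; file names are made up).
import subprocess
from pathlib import Path

def package_hls(src: str, out_dir: str, segment_seconds: int = 6) -> Path:
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    playlist = out / "index.m3u8"
    cmd = [
        "ffmpeg", "-y", "-i", src,
        "-c:v", "libx264", "-c:a", "aac",        # transcode video/audio
        "-f", "hls",                              # HLS muxer
        "-hls_time", str(segment_seconds),        # target segment duration
        "-hls_playlist_type", "vod",              # write a full VOD playlist
        "-hls_segment_filename", str(out / "seg_%05d.ts"),
        str(playlist),
    ]
    subprocess.run(cmd, check=True)
    return playlist

if __name__ == "__main__":
    package_hls("input.mp4", "hls_out")
```

A production pipeline would normally produce several bitrate renditions plus a master playlist; the flags above only cover the basic segmenting step.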
GOOD TO KNOW
Should have knowledge of pandas, Luigi, Celery, Django, Flask, and Python packaging.
Should have worked with big data.
Should have worked with message queuing tools like Kafka, ZeroMQ, etc. (see the consumer sketch below).
You have exceptional knowledge of encryption, security & networking.
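As a hedged example of the message-queue experience mentioned above (not from the posting), here is a minimal Kafka consumer sketch using the kafka-python client; the broker address, topic name, and message shape are assumptions.

```python
# Hypothetical sketch: consume JSON events from Kafka with the kafka-python
# client (broker address, topic name, and message shape are assumptions).
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "playback-events",                      # assumed topic name
    bootstrap_servers=["localhost:9092"],   # assumed broker address
    group_id="analytics-workers",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Minimal processing: log by event type; a real worker would aggregate,
    # enrich, or hand the event off to another service here.
    print(event.get("type"), message.partition, message.offset)
```

With kafka-python, offsets are auto-committed by default; a real worker would usually tune commit behaviour and add error handling around the processing step.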




