11+ TAC Jobs in Hyderabad | TAC Job openings in Hyderabad
ETL Developer – Talend
Job Duties:
- Design and develop ETL jobs that follow standards and best practices and are maintainable, modular, and reusable.
- Proficiency with Talend or Pentaho Data Integration / Kettle.
- Analyze and review complex object and data models and the metadata repository in order to structure the processes and data for better management and more efficient access.
- Work on multiple projects, delegating work to junior analysts to deliver projects on time.
- Train and mentor junior analysts, building their proficiency in the ETL process.
- Prepare mapping documents for extracting, transforming, and loading data, ensuring compatibility with all tables and requirement specifications.
- Experience in ETL system design and development with Talend / Pentaho PDI is essential.
- Create quality rules in Talend.
- Tune Talend / Pentaho jobs for performance optimization.
- Write relational (SQL) and multidimensional (MDX) database queries.
- Functional knowledge of Talend Administration Center / Pentaho Data Integration job servers, load-balancing setup, and their administrative functions.
- Develop, maintain, and enhance unit test suites to verify the accuracy of ETL processes, dimensional data, OLAP cubes, and various forms of BI content, including reports, dashboards, and analytical models.
- Exposure to the MapReduce components of Talend / Pentaho PDI.
- Comprehensive understanding and working knowledge of data warehouse loading, tuning, and maintenance.
- Working knowledge of relational database theory and dimensional database models.
- Experience creating and deploying Talend / Pentaho custom components is an added advantage.
- Java knowledge is nice to have; a minimal routine sketch follows this list.
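Since Talend routines and custom components are written in plain Java, the following minimal sketch shows the kind of reusable routine a quality rule might call from a tMap or filter expression. The class and method names here are hypothetical, not part of any actual project.

```java
// Hypothetical Talend-style routine: a plain Java class with static helper
// methods, callable from component expressions, e.g.
// QualityRules.normalizeEmail(row1.email) inside a tMap.
public class QualityRules {

    // Trim and lower-case an email address; return null for blank input so
    // downstream filters can route invalid rows to a reject flow.
    public static String normalizeEmail(String raw) {
        if (raw == null || raw.trim().isEmpty()) {
            return null;
        }
        return raw.trim().toLowerCase();
    }

    // Simple structural validity check, usable as a reject condition.
    public static boolean isValidEmail(String email) {
        return email != null && email.matches("^[^@\\s]+@[^@\\s]+\\.[^@\\s]+$");
    }
}
```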
Skills and Qualifications:
- BE / B.Tech / MS degree in Computer Science, Engineering, or a related subject.
- 3+ years of relevant experience.
- Proficiency with Talend or Pentaho Data Integration / Kettle.
- Ability to work independently.
- Ability to lead a team.
- Good written and oral communication skills.
Store Manager
Job Duties:
- Oversee daily store operations, ensuring everything runs smoothly and efficiently.
- Manage and lead a team of sales associates, setting goals and providing coaching to ensure targets are met.
- Maintain high standards of customer service, addressing customer concerns and feedback.
- Monitor inventory levels, order stock, and manage product displays.
- Analyze sales data and prepare reports for senior management.
- Implement promotional activities and sales strategies to drive revenue.
- Ensure compliance with all company policies, health and safety regulations, and local laws.
- Conduct regular store audits and ensure the store's appearance is up to company standards.
- Handle store budgets, payroll, and scheduling efficiently.
Who We Are
At Sonatype, we help organizations build better, more secure software by enabling them to understand and control their software supply chains. Our products are trusted by thousands of engineering teams globally, providing critical insights into dependency health, license risk, and software security. We’re passionate about empowering developers—and we back it with data.
The Opportunity
We’re looking for a Data Engineer with full stack expertise to join our growing Data Platform team. This role blends data engineering, microservices, and full-stack development to deliver end-to-end services that power analytics, machine learning, and advanced search across Sonatype.
You will design and build data-driven microservices and workflows using Java, Python, and Spring Batch, implement frontends for data workflows, and deploy everything through CI/CD pipelines into AWS ECS/Fargate. You’ll also ensure services are monitorable, debuggable, and reliable at scale, while clearly documenting designs with Mermaid-based sequence and dataflow diagrams.
This is a hands-on engineering role for someone who thrives at the intersection of data systems, full-stack development, ML, and cloud-native platforms.
What You’ll Do
- Design, build, and maintain data pipelines, ETL/ELT workflows, and scalable microservices.
- Develop complex web-scraping (Playwright) and real-time pipelines (Kafka, queues, Flink).
- Develop end-to-end microservices with backend (Java 5+, Python 5+, Spring Batch 2+) and frontend (React or similar); a minimal Spring Batch sketch follows this list.
- Deploy, publish, and operate services in AWS ECS/Fargate using CI/CD pipelines (Jenkins, GitOps).
- Architect and optimize data storage models in SQL (MySQL, PostgreSQL) and NoSQL stores.
- Implement web scraping and external data ingestion pipelines.
- Enable Databricks and PySpark-based workflows for large-scale analytics.
- Build advanced data search capabilities (fuzzy matching, vector similarity search, semantic retrieval).
- Apply ML techniques (scikit-learn, classification algorithms, predictive modeling) to data-driven solutions.
- Implement observability, debugging, monitoring, and alerting for deployed services.
- Create Mermaid sequence diagrams, flowcharts, and dataflow diagrams to document system architecture and workflows.
- Drive best practices in fullstack data service development, including architecture, testing, and documentation.
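As an illustration of the Spring Batch piece of this stack, here is a minimal sketch of a chunk-oriented job configuration, assuming the Spring Batch 4.x builder-factory API. The job, step, and bean names (dependencyFeedJob, rawRecordReader, and so on) are invented for the example.

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableBatchProcessing
public class DependencyFeedJobConfig {

    // Chunk-oriented step: read raw records, normalize them, and write them
    // in batches of 100 per transaction. The reader, processor, and writer
    // beans are assumed to be defined elsewhere.
    @Bean
    public Step ingestStep(StepBuilderFactory steps,
                           ItemReader<String> rawRecordReader,
                           ItemProcessor<String, String> normalizer,
                           ItemWriter<String> warehouseWriter) {
        return steps.get("ingestStep")
                .<String, String>chunk(100)
                .reader(rawRecordReader)
                .processor(normalizer)
                .writer(warehouseWriter)
                .build();
    }

    @Bean
    public Job dependencyFeedJob(JobBuilderFactory jobs, Step ingestStep) {
        return jobs.get("dependencyFeedJob").start(ingestStep).build();
    }
}
```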
What We’re Looking For
- 5+ years of experience as a Data Engineer or in a backend software engineering role.
- Strong programming skills in Python, Scala, or Java.
- Hands-on experience with HBase or similar NoSQL columnar stores.
- Hands-on experience with distributed data systems like Spark, Kafka, or Flink.
- Proficient in writing complex SQL and optimizing queries for performance.
- Experience building and maintaining robust ETL/ELT pipelines in production.
- Familiarity with workflow orchestration tools (Airflow, Dagster, or similar).
- Understanding of data modeling techniques (star schema, dimensional modeling, etc.); see the SQL sketch after this list.
- Familiarity with CI/CD pipelines (Jenkins or similar).
- Ability to visualize and communicate architectures using Mermaid diagrams.
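To make the SQL and dimensional-modeling expectations concrete, the sketch below runs a star-schema rollup (a fact table joined to a date dimension on a surrogate key) through plain JDBC. The connection string, table names, and columns are hypothetical placeholders.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class StarSchemaQuery {
    public static void main(String[] args) throws SQLException {
        // Hypothetical star schema: fact_sales joined to dim_date on a
        // surrogate key, rolled up to yearly totals.
        String sql =
            "SELECT d.calendar_year, SUM(f.sales_amount) AS total_sales " +
            "FROM fact_sales f " +
            "JOIN dim_date d ON f.date_key = d.date_key " +
            "GROUP BY d.calendar_year " +
            "ORDER BY d.calendar_year";

        try (Connection conn = DriverManager.getConnection(
                 "jdbc:postgresql://localhost:5432/dw", "user", "password");
             PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.printf("%d: %.2f%n", rs.getInt(1), rs.getDouble(2));
            }
        }
    }
}
```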
Bonus Points
- Experience working with Databricks, dbt, Terraform, or Kubernetes
- Familiarity with streaming data pipelines or real-time processing
- Exposure to data governance frameworks and tools
- Experience supporting data products or ML pipelines in production
- Strong understanding of data privacy, security, and compliance best practices
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software
- Modern tooling: Leverage the best of open-source and cloud-native technologies
- Collaborative culture: Join a passionate team that values learning, autonomy, and impact
Worksoft Certify
Role: Worksoft Certify Tester
Experience: 4-13 years
Location: Bangalore / Chennai / Hyderabad
- Experience in SAP SD / MM functional testing.
- Hands-on experience integrating Worksoft Certify with applications such as SolMan, HP ALM, and Selenium using APIs.
- Requirement Analysis: Understand business requirements and translate them into test cases and scenarios for SAP modules.
- Test Planning: Develop test plans and strategies for SAP testing, considering both manual and automated testing approaches.
- Tool Configuration and Setup: Configure Worksoft automation tools to suit the specific requirements of SAP testing.
- Exposure to cloud applications.
- Experience designing and implementing automation frameworks.
- Experience customizing automation best practices.
- Good communication, documentation, and presentation skills.
- Experience implementing DevOps tools is an added advantage.
- Experience and automation exposure in an S/4HANA journey is a plus.
- Experience building test management and reporting dashboards.
Skills:
- SAP Automation with Worksoft Certify (Mandatory)
- Web Automation with Worksoft Certify (Non-mandatory)
- Execution Manager or CTM (Mandatory)
- SAP Functional Knowledge of SD, MM & FICO (Mandatory)
- Excel Automation (Non-mandatory)
Role: Teamcenter Lead Developer
Experience: 6+ years
Location: Bangalore / Hyderabad / Pune / Chennai
Notice Period: Immediate to 15 days preferred; 30 days maximum
Key Responsibilities
- Core PLM Support & Expertise
  - Manage and support key Teamcenter functionalities, including:
    - Parts & Components Management
    - CAD Data Management & Integration
    - BOM Management
    - Change Management
    - Supplier Collaboration
    - Enterprise Data Integration
- Configuration & Customization
  - Configure and customize Teamcenter modules to meet evolving business requirements
  - Ensure alignment with PLM industry best practices and internal process standards
- Troubleshooting & End-User Support
  - Investigate, diagnose, and resolve Teamcenter-related issues
  - Deliver Tier 2/3 application support and guidance to end users and business stakeholders
  - Maintain system uptime and performance, ensuring minimal business disruption
- Code Implementation
  - Develop and maintain clean, efficient code using Teamcenter-specific technologies such as Java, C++, ITK (Integration Toolkit), and SOA frameworks
  - Support enhancements and new functionality development
- System Integration
  - Collaborate with IT and engineering teams to integrate Teamcenter with ERP, CAD, and other enterprise platforms
  - Enable seamless data flow and automation across interconnected systems
- Solution Development
  - Design and deliver custom solutions within the Teamcenter platform, including workflows, UI modifications, reports, and extension modules
  - Support ongoing innovation and platform optimization
Qualifications
- Bachelor’s degree in Computer Science, Engineering, or a related technical field preferred
- 6+ years of experience supporting and developing within the Teamcenter PLM ecosystem
- Strong hands-on experience with Java, C++, ITK, and SOA for Teamcenter development
- Solid understanding of PLM best practices and enterprise system integration
- Proven problem-solving skills and the ability to manage technical escalations
Nice to Have
- Experience working in a global engineering or manufacturing environment
- Familiarity with Agile methodologies and DevOps principles
- Knowledge of enterprise systems like SAP, Oracle, or other major ERP platforms
· Core responsibilities include analyzing business requirements and designs for accuracy and completeness, and developing and maintaining the relevant product.
· BlueYonder is seeking a Senior/Principal Architect in the Data Services department (under the Luminate Platform) to act as one of the key technology leaders who build and manage BlueYonder’s technology assets in the Data Platform and Services.
· This individual will act as a trusted technical advisor and strategic thought leader to the Data Services department. The successful candidate will have the opportunity to lead, participate, guide, and mentor others on the team on architecture and design in a hands-on manner. You will be responsible for the technical direction of the Data Platform. This position reports to the Global Head, Data Services, and will be based in Bangalore, India.
· Core responsibilities include architecting and designing (along with counterparts and distinguished architects) a ground-up, cloud-native (we use Azure) SaaS product for order management and micro-fulfillment.
· The team currently comprises 60+ global associates across the US, India (COE), and the UK, and is expected to grow rapidly. The incumbent will need leadership qualities to mentor junior and mid-level software associates on our team. This person will lead the Data Platform architecture (streaming and bulk) with Snowflake, Elasticsearch, and other tools.
Our current technical environment:
· Software: Java, Spring Boot, Gradle, Git, Hibernate, REST API, OAuth, Snowflake
· Application Architecture: scalable, resilient, event-driven, secure multi-tenant microservices architecture (a minimal Kafka consumer sketch follows below)
· Cloud Architecture: MS Azure (ARM templates, AKS, HDInsight, Application Gateway, Virtual Networks, Event Hub, Azure AD)
· Frameworks/Others: Kubernetes, Kafka, Elasticsearch, Spark, NoSQL, RDBMS, Spring Boot, Gradle, Git, Ignite
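As a concrete illustration of the event-driven microservices environment described above, here is a minimal consumer loop using the standard Kafka Java client. The topic name and connection settings are placeholders; against Azure Event Hub, the bootstrap server and SASL configuration would differ.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class OrderEventConsumer {
    public static void main(String[] args) {
        // Placeholder connection settings; real deployments would pull these
        // from configuration and add security (SASL/TLS) properties.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "order-service");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("order-events")); // hypothetical topic
            while (true) {
                // Poll for new events and hand each one to business logic.
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s%n",
                            record.key(), record.value());
                }
            }
        }
    }
}
```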
Experience Level: 5+ years
Proven experience as a .NET Developer
Familiarity with the ASP.NET Core framework, SQL Server, Web API, and microservices
Understanding of Agile methodologies
Excellent troubleshooting and communication skills
Required Skills: ASP.NET Core Web API and React JS
Kindly share the profile through LinkedIn
ID: linkedin.com/in/ranjini-c-n-36b674131
Skill: Spark and Scala, along with Azure
Location: Pan India
Looking for someone with Big Data experience along with Azure; a small Spark sketch follows below.
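For a flavor of the work, the sketch below shows a minimal Spark aggregation. It uses Spark's Java API to stay consistent with the other examples on this page; the equivalent Scala code is nearly identical. The input path and column names are hypothetical, and on Azure the path would typically be an ADLS Gen2 abfss:// URI.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class SalesRollup {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("SalesRollup")
                .getOrCreate();

        // Hypothetical input; on Azure this would be an abfss:// path into
        // ADLS Gen2 with the appropriate hadoop-azure configuration.
        Dataset<Row> sales = spark.read().parquet("/data/sales");

        // Total sales per region, largest first.
        sales.groupBy(col("region"))
             .agg(sum(col("amount")).alias("total_amount"))
             .orderBy(col("total_amount").desc())
             .show();

        spark.stop();
    }
}
```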
- Proficient with Objective-C, Swift, Cocoa Touch, and UIKit.
- Experience with iOS frameworks such as Core Data, Core Animation, etc.
- Knowledge of Apple’s design principles, interface guidelines, and UI/UX standards.
- Experience with performance and memory tuning with tools such as Instruments.
- Familiarity with cloud message APIs and push notifications
- Proficient understanding of code versioning tools such as Git and SVN
- Experience in Payment Integration, Push Notification & Third Party Integration.
- Experienced with the Apple app approval and distribution process, including Ad Hoc and Enterprise distribution.
- Worked with various architectures and patterns such as MVC, MVVM, Singleton, Delegate, and Notification.
- Good to have knowledge of / experience in developing GUIs for C5 VoIP applications.
- Good to have knowledge of WebRTC and various VoIP standards.
• Strong experience in Java/J2EE development is required
• Excellent working knowledge of Spring MVC and Spring Boot
• Strong background in developing and deploying software that runs in a real-time, multi-threaded environment
• Good knowledge and experience with concepts of MVC, JDBC and RESTful API Integration
• Experience with threaded and asynchronous environments (see the sketch at the end of this posting)
• Experience with any of the following Frameworks is Desired: Spring, Spring Boot, Hibernate
• Fundamental understanding of design patterns
• Working knowledge of SOAP/XML/WSDL
• Proven experience in MongoDB
• Experience supporting and troubleshooting problems in a highly complex environment
• Familiar with agile / scrum development methodologies
• Proficient understanding of code versioning tools such as Git, Bitbucket, and SVN
• Familiarity with Continuous Integration and tools such as Maven and Jenkins.
Role & Responsibilities:
• You will be responsible for Java development and building large scale applications that are high performance, scalable, and resilient in an SOA environment
• Working closely with end-users and other members of the team to identify and employ the best solutions
• Developing and implementing strong algorithms/techniques for solving problems in a high-volume, high-availability environment
• Engaging end-users to identify new requirements, strategic direction and highlight issues
• Defining and building maintainable processes that provide resilient and stable platforms, which support end users’ business and technical demands
• Integrating new services and providing clean APIs and services for applications
• Understanding volume growth to ensure the systems scale to meet demand
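To ground the multi-threaded, asynchronous requirements above, here is a minimal, illustrative Spring Boot endpoint that serves a request asynchronously by returning a CompletableFuture, which lets the servlet thread be released while the work runs on a worker thread. The application name, route, and lookup logic are invented for the sketch.

```java
import java.util.concurrent.CompletableFuture;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class OrderServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }

    // Returning CompletableFuture enables Spring MVC's asynchronous request
    // processing: the container thread is freed while the lookup runs on a
    // worker thread. The "order lookup" here is a stand-in for real logic.
    @GetMapping("/orders/{id}")
    public CompletableFuture<String> getOrder(@PathVariable String id) {
        return CompletableFuture.supplyAsync(() -> "order:" + id);
    }
}
```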