11+ Teradata Jobs in Hyderabad | Teradata Job openings in Hyderabad
Apply to 11+ Teradata Jobs in Hyderabad on CutShort.io. Explore the latest Teradata Job opportunities across top companies like Google, Amazon & Adobe.
We are looking for a Teradata developer for one of our premium clients. Kindly contact me if interested.
About the job
Job Description
Managing a portfolio of SME clients, monitoring their insurance needs, and proactively engaging with them to ensure client satisfaction and retention.
Developing and maintaining strong relationships with SME clients, understanding their business needs, and providing personalized service.
Identifying and pursuing new business opportunities within the SME sector to grow the insurance company's client base and revenue.
Ensuring smooth and efficient service delivery to SME clients, addressing their queries and concerns promptly.
Selling insurance products and services to SMEs, as well as cross-selling other relevant financial products offered by the company.
Working with internal teams and customer service to facilitate seamless service delivery to SME clients.
Skills And Qualifications
Excellent communication, interpersonal, and negotiation skills.
Proven track record of achieving sales targets and identifying new business opportunities.
Understanding of various insurance products and policies relevant to SMEs.
Providing customer service and building long-term relationships.
Strong experience with, and familiarity with, the unique needs and challenges of SME businesses.
Bachelor’s Degree.
Strong Snowflake Data Architect profile (Cloud Data Platform / AI-led Data Transformation)
Mandatory (Experience 1) – Must have 8+ years of experience in Data Engineering / Data Architecture, with strong focus on building enterprise-scale data platforms
Mandatory (Experience 2) – Must have 3+ years of deep hands-on experience in Snowflake architecture, including designing and implementing scalable data warehouse solutions
Mandatory (Experience 3) – Strong expertise in Snowflake features including Resource Monitors, RBAC, Virtual Warehouses, Time Travel, Zero Copy Clone, and query performance optimization
Mandatory (Experience 4) – Proven experience building and managing data ingestion pipelines using Snowpipe, handling structured, semi-structured (JSON, XML), and columnar data formats (Parquet)
Mandatory (Experience 5) – Strong experience in cloud ecosystem, preferably AWS, including S3, Lambda, EC2, Redshift, and integration with Snowflake-based architectures
Mandatory (Experience 6) – Proven experience in migrating data from on-premise or legacy systems to Snowflake, including data modeling, transformation, and validation
Mandatory (Experience 7) – Hands-on experience in SQL, SnowSQL, Python, or PySpark for data transformation, automation, and monitoring
Mandatory (Experience 8) – Experience in data modeling, partitioning, micro-partitions, and re-clustering strategies in Snowflake
Mandatory (Experience 9) – Must have experience working in client-facing or consulting roles, including requirement gathering, solution design, and stakeholder communication
Mandatory (Skill 1) – Strong understanding of end-to-end data architecture including ETL/ELT pipelines, data lakes, and warehouse integration
Mandatory (Skill 2) – Experience in designing monitoring and automation frameworks using Python, Bash, or similar tools
Mandatory (Skill 3) – Ability to translate business requirements into scalable technical solutions and define future-state data architecture roadmaps
Mandatory (Note) – Only immediate joiners or candidates who can join within 15 days will be considered
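The monitoring and automation frameworks mentioned under Mandatory Skill 2 can be as simple as a scheduled credit-usage check. Below is a minimal illustrative sketch in plain Python; the warehouse names, quotas, and usage figures are hypothetical placeholders, not real Snowflake accounts or API calls.

```python
# Hypothetical warehouse credit-usage check, the kind of lightweight
# monitoring/automation script this role calls for. In a real setup the
# usage dict would come from Snowflake's account usage views.

def over_quota(usage, quotas):
    """Return warehouse names whose consumed credits exceed their quota."""
    return sorted(
        name for name, credits in usage.items()
        if credits > quotas.get(name, float("inf"))
    )

if __name__ == "__main__":
    usage = {"ETL_WH": 120.5, "BI_WH": 30.0, "ADHOC_WH": 75.2}
    quotas = {"ETL_WH": 100, "BI_WH": 50, "ADHOC_WH": 60}
    print(over_quota(usage, quotas))  # ['ADHOC_WH', 'ETL_WH']
```

A production version would pair a check like this with Snowflake Resource Monitors, which can suspend a warehouse automatically at a credit threshold.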
Strong Microsoft Fabric / Azure Data Architect profile
Mandatory (Experience 1) – Must have 8+ years of experience in Data Architecture / Data Engineering, with strong exposure to enterprise-scale data platform modernization initiatives
Mandatory (Experience 2) – Must have 3+ years of deep hands-on experience in Microsoft Fabric ecosystem including Fabric Lakehouse, OneLake, and Data Factory, with large-scale implementations
Mandatory (Experience 3) – Strong expertise in designing and implementing Medallion (Bronze/Silver/Gold) architecture and scalable lakehouse platforms supporting batch and real-time workloads
Mandatory (Experience 4) – Strong experience in Azure data ecosystem including Azure Data Factory, Azure Synapse, ADLS, and Power BI, with good understanding of cloud-native data architectures
Mandatory (Experience 5) – Proven experience designing scalable data models including Dimensional Modelling (Star/Snowflake) and/or Data Vault for enterprise data warehouses
Mandatory (Experience 6) – Must have hands-on experience building ingestion pipelines including batch, streaming, and CDC pipelines using tools like Spark, Kafka, or Fabric pipelines
Mandatory (Experience 7) – Strong experience in implementing data governance frameworks including data cataloging, lineage, metadata management, and security controls
Mandatory (Skill 1) – Proven experience in building CI/CD pipelines for data platforms using Azure DevOps / Git, including automated deployment and environment management
Mandatory (Skill 2) – Hands-on experience designing AI/ML-ready data platforms, enabling advanced analytics, machine learning, and Generative AI use cases
Mandatory (Skill 3) – Experience with orchestration and workflow tools, and integrating data platforms with BI tools like Power BI for enterprise reporting
Mandatory (Note) – Only immediate joiners (within 15 days) will be considered
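The Medallion (Bronze/Silver/Gold) architecture named in Experience 3 is a layered refinement flow: raw landing, cleaned/deduplicated, then aggregated for consumption. The toy sketch below uses plain Python records in place of Fabric Lakehouse tables; the field names and the dedup/aggregation rules are illustrative assumptions only.

```python
# Toy sketch of a Medallion (Bronze -> Silver -> Gold) flow using plain
# Python dicts in place of lakehouse tables.

def to_silver(bronze_rows):
    """Clean layer: drop rows missing an id, deduplicate on id (last write wins)."""
    silver = {}
    for row in bronze_rows:
        if row.get("id") is not None:
            silver[row["id"]] = row
    return list(silver.values())

def to_gold(silver_rows):
    """Aggregate layer: total amount per region, ready for BI consumption."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
    return totals

bronze = [
    {"id": 1, "region": "south", "amount": 10},
    {"id": None, "region": "south", "amount": 99},  # rejected in Silver
    {"id": 1, "region": "south", "amount": 12},     # duplicate id, last wins
    {"id": 2, "region": "north", "amount": 7},
]
print(to_gold(to_silver(bronze)))  # {'south': 12, 'north': 7}
```

In Fabric itself, each layer would typically be a Lakehouse table populated by Data Factory or Spark pipelines rather than in-memory dicts.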
1. 4+ years of software development experience
2. Strong experience with Kubernetes, Docker, and CI/CD pipelines in cloud-native environments.
3. Hands-on with NATS for event-driven architecture and streaming.
4. Skilled in microservices, RESTful APIs, and containerized app performance optimization.
5. Strong in problem-solving, team collaboration, clean code practices, and continuous learning.
6. Proficient in Java (Spring Boot) and Python (Flask) for building scalable applications and APIs.
7. Focus: Java, Python, Kubernetes, Cloud-native development
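Item 3 above asks for NATS experience; the core idea there is subject-based publish/subscribe. The stand-in sketch below models that pattern in plain Python — it is not the real NATS client (nats-py), and the `Bus` class and subject names are invented for illustration.

```python
# Stand-in sketch of subject-based publish/subscribe, the messaging model
# NATS uses. Plain Python only; no broker or network involved.
from collections import defaultdict

class Bus:
    def __init__(self):
        self.subs = defaultdict(list)

    def subscribe(self, subject, handler):
        self.subs[subject].append(handler)

    def publish(self, subject, msg):
        # Deliver to every handler subscribed to this subject.
        for handler in self.subs[subject]:
            handler(msg)

bus = Bus()
seen = []
bus.subscribe("orders.created", seen.append)
bus.publish("orders.created", {"id": 42})
bus.publish("orders.deleted", {"id": 7})  # no subscriber, message dropped
print(seen)  # [{'id': 42}]
```

Real NATS adds wildcards, queue groups, and (with JetStream) persistence on top of this basic model.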
We are seeking a detail-oriented and innovative Chemical Engineer. The ideal candidate will be responsible for designing, developing, and optimizing processes for manufacturing, ensuring safety, quality, sustainability, and efficiency. This role requires strong problem-solving skills and the ability to work in cross-functional teams to improve production systems.
Key Responsibilities:
- Design, develop, and implement chemical processes for production.
- Monitor and optimize plant operations to improve yield, reduce costs, and minimize waste.
- Ensure compliance with health, safety, and environmental regulations.
- Conduct research and feasibility studies to develop new products and processes.
- Collaborate with R&D, production, and quality teams to improve efficiency.
- Troubleshoot technical issues in manufacturing and propose solutions.
- Prepare and maintain technical documentation, reports, and SOPs.
- Implement best practices for sustainability and energy efficiency.
Qualifications & Skills:
- Bachelor’s or Master’s degree in Chemical Engineering (or related field).
- Strong knowledge of chemical processes, thermodynamics, fluid mechanics, and reaction engineering.
- Experience with process simulation tools (e.g., Aspen Plus, HYSYS, MATLAB) is an advantage.
- Good analytical, mathematical, and problem-solving skills.
- Strong communication and teamwork abilities.
- Knowledge of safety standards and environmental regulations.
Position: Chartered Accountant
Please contact sairam.akirala@kiaraglobalservices.com (798 981 2178).
Key Requirements:
- Chartered Accountant (CA) qualification completed (0 to 3 years of post-qualification experience).
- Strong understanding of accounting principles, taxation laws, and financial regulations.
- Proficiency in accounting software (Tally, QuickBooks, SAP, etc.) and MS Excel.
- Excellent attention to detail and ability to work with accuracy.
- Good analytical and problem-solving skills.
- Strong communication skills and ability to work collaboratively with teams.
- Ability to meet deadlines and work under pressure.
• Analyzing/triaging Wi-Fi & GPS connectivity issues and enhancing Wi-Fi & GPS features
Job requirements:
• Engineering degree
• 5-8 years of hands-on experience with connectivity concepts and Wi-Fi modules
• Strong knowledge of Wi-Fi architecture, the Android framework, and the firmware layer
• Strong knowledge of Wi-Fi features and Wi-Fi standards
• Good understanding of the Qualcomm CNSS subsystem, firmware, and system-level features
• Strong knowledge of Wi-Fi debugging concepts, log collection, and WLAN standards
• Ready to debug/work in any connectivity module such as NFC/Bluetooth/GPS
• Experience with the Qualcomm SM 6X/7X/8X series is preferred
Greetings from Delhi Public School & Pallavi Group of Schools & Colleges!
We are one of the largest school groups in India, comprising DPS (Nacharam / Mahendra Hills / Nadergul) and the Pallavi Group of Schools, with more than 30,000 students studying across our schools in Hyderabad.
We are looking for a Vice-Principal or Senior Headmistress with a minimum of 5 years of experience in the CBSE curriculum, currently teaching as a TGT/PGT (mandatory).
Qualification: Postgraduate with M.Ed., with experience in a CBSE/International school (M.Phil./Ph.D. will be an additional qualification). Excellent communication skills and leadership qualities.
Looking for female candidates who can join immediately.
Interested candidates can send their updated resume.
Regards
G Anitha
HR Manager
Delhi Public School,
Mahendra Hills / Nacharam / Nadergul
Python API Developer
JD:
Experience: 4-6 Yrs
Notice Period: 10-20 days or within 1 month
>> Develop and maintain various security software products with queues, caching & database management.
>> Hands-on Python coding experience is required, along with knowledge of data structures, algorithms, and object-oriented programming.
>> Extensive experience in developing asynchronous systems.
>> Integration of user-facing elements developed by front-end developers with server-side logic
>> Implementation of security and data protection.
>> Performance tuning, improvement, balancing, usability, automation
Mandatory Skills:
- Python
- Flask/Django
- API
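The JD above emphasizes asynchronous systems with queues. A minimal sketch of that pattern, using only Python's standard-library asyncio — the task values and the doubling "work" are placeholders for real processing:

```python
# Minimal asyncio queue/worker pattern: producers put items on a queue,
# a pool of worker tasks consumes them concurrently.
import asyncio

async def worker(queue, results):
    while True:
        item = await queue.get()
        results.append(item * 2)  # stand-in for real processing
        queue.task_done()

async def main():
    queue, results = asyncio.Queue(), []
    workers = [asyncio.create_task(worker(queue, results)) for _ in range(3)]
    for n in range(5):
        await queue.put(n)
    await queue.join()  # block until every queued item is processed
    for w in workers:
        w.cancel()
    return sorted(results)

print(asyncio.run(main()))  # [0, 2, 4, 6, 8]
```

The same shape scales to real backends: swap `asyncio.Queue` for a broker client and the doubling step for the actual handler.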
Urgent Requirement for PHP CodeIgniter Developer
Exp : 3 to 5 Years
Location : Hyderabad
Skills Required: HTML, CSS, MySQL, Core PHP, OOPS, MVC, API integrations
Framework knowledge such as CodeIgniter, Yii
Knowledge of JavaScript, jQuery customization, and basic AngularJS
CMS knowledge such as WordPress, Joomla
Candidates must have hands-on experience with the above skills