11+ EDA Jobs in Hyderabad | EDA Job openings in Hyderabad
2. Hands-on experience with advanced technology nodes such as 7nm, 10nm, and 14nm.
3. Good knowledge of EDA tools from Synopsys, Cadence, and Mentor.
4. Hands-on experience in floorplanning, placement optimization, CTS, and routing.
5. Hands-on experience with Cadence or Synopsys tools (Encounter, ICC, PT/PTSI, Tempus, DC, RC, Voltus).
At Loyalty Juggernaut, we’re on a mission to revolutionize customer loyalty through AI-driven SaaS solutions. We are THE JUGGERNAUTS, driving innovation and impact in the loyalty ecosystem with GRAVTY®, our SaaS product that empowers multinational enterprises to build deeper customer connections. Designed for scalability and personalization, GRAVTY® delivers cutting-edge loyalty solutions that transform customer engagement across diverse industries, including airlines, airports, retail, hospitality, banking, F&B, telecom, insurance, and broader ecosystems.
Our Impact:
- 400+ million members connected through our platform.
- Trusted by 100+ global brands/partners, driving loyalty and brand devotion worldwide.
Proud to be a Three-Time Champion for Best Technology Innovation in Loyalty!
Explore more about us at www.lji.io.
What you will OWN:
- Build the infrastructure required for optimal extraction, transformation, and loading of data from various sources using SQL and AWS ‘big data’ technologies.
- Create and maintain optimal data pipeline architecture.
- Identify, design, and implement internal process improvements, automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Work with stakeholders, including the Technical Architects, Developers, Product Owners, and Executives, to assist with data-related technical issues and support their data infrastructure needs.
- Create tools for data management and data analytics that help our teams build and optimize our product into an innovative industry leader.
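The extract-transform-load work described above can be sketched in miniature. This is an illustrative example only — the table names (`raw_orders`, `customer_spend`) and fields are hypothetical, and an in-memory SQLite database stands in for the AWS warehouse the role actually uses:

```python
import sqlite3

def extract(conn):
    """Pull raw order rows from a hypothetical source table."""
    return conn.execute("SELECT customer_id, amount FROM raw_orders").fetchall()

def transform(rows):
    """Aggregate spend per customer -- a typical transformation step."""
    totals = {}
    for customer_id, amount in rows:
        totals[customer_id] = totals.get(customer_id, 0) + amount
    return totals

def load(conn, totals):
    """Write the aggregated result to a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customer_spend "
        "(customer_id TEXT PRIMARY KEY, total REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customer_spend VALUES (?, ?)", totals.items()
    )
    conn.commit()

# Demo with an in-memory database standing in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer_id TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [("c1", 10.0), ("c2", 5.0), ("c1", 2.5)])
load(conn, transform(extract(conn)))
print(dict(conn.execute("SELECT customer_id, total FROM customer_spend")))
```

In production the same three-stage shape would typically be expressed against Redshift or another AWS 'big data' service rather than SQLite.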
You would make a GREAT FIT if you:
- Have 2 to 5 years of relevant backend development experience, with solid expertise in Python.
- Possess strong skills in data structures and algorithms, and can write optimized, maintainable code.
- Are familiar with database systems, and can comfortably work with PostgreSQL as well as NoSQL solutions like MongoDB or DynamoDB.
- Have hands-on experience with cloud data warehouses like AWS Redshift, GBQ, etc.
- Have experience with AWS cloud services (EC2, EMR, RDS, Redshift, and AWS Batch); this would be an added advantage.
- Have a solid understanding of ETL processes and tools, and can build or modify ETL pipelines effectively.
- Have experience managing or building data pipelines and architectures at scale.
- Understand the nuances of data ingestion, transformation, storage, and analytics workflows.
- Communicate clearly and work collaboratively across engineering and product teams.
Why Choose US?
- This opportunity offers a dynamic and supportive work environment where you'll have the chance to not just collaborate with talented technocrats but also work with globally recognized brands, gain exposure, and carve your own career path.
- You will get to innovate and dabble in the future of technology: Enterprise Cloud Computing, Blockchain, Machine Learning, AI, Mobile, Digital Wallets, and much more.
Job Description
Position Title: Senior System Engineer
Position Type: Full Time
Department: RSG
Reports to: First Level Manager, Indian Development Centre
Company Background:
Cglia is a software development company building highly available, highly secure, cloud-based enterprise software products that help speed the research process, resulting in new drugs, new devices, and new treatments that improve the health and wellbeing of the world's population.
At Cglia, our work shows our dedication and passion for innovative, quality software products that are intuitive, easy to use, and exceed every aspect of customer expectations.
Cglia is the place that develops world-class professionals who want to be innovative and creative, learn continuously, and build a solid foundation for creating products that are special and delight the customer.
Job Description:
The Senior System Engineer will have expertise in managing both Linux and Windows environments, along with hands-on experience in containerization technologies such as Kubernetes and Docker. Proficiency in Ansible for automation and configuration management is essential. This role is critical in ensuring the seamless operation, deployment, and maintenance of our IT infrastructure.
The ideal candidate will oversee and participate in the installation, monitoring, maintenance, support, optimization, and documentation of all network hardware and software. This includes managing multiple projects, planning network technology roadmaps, and configuring/optimizing network services, both internal and those integrated with Internet-based services.
Job Responsibilities:
· Manage, maintain, and monitor Linux and Windows servers to ensure high availability and performance.
· Perform system upgrades, patches, and performance tuning for both operating systems and database servers.
· Deploy, manage, and troubleshoot containerized applications using Kubernetes and Docker.
· Design and implement Kubernetes clusters to ensure scalability, security, and reliability.
· Develop and maintain Ansible playbooks for automation of repetitive tasks, configuration management, and system provisioning.
· Implement security best practices for both Linux and Windows environments.
· Set up and manage backup and disaster recovery solutions for critical systems and data.
· Work closely with development teams to support CI/CD pipelines and troubleshoot application issues.
· Manage VMware in a high-availability environment with disaster recovery
· Manage RAID and firewall configurations
· Maintain and manage SQL database server support
· Write and maintain scripts in Unix shell, Bash, or PowerShell
· Assist Quality Assurance with testing program changes, new releases, and user documentation, and support new product release activities, including testing customer flows
· Must have the ability to work a flexible schedule and participate in an on-call rotation, which includes different shift timings, weekends, and holidays
· Work across multiple time zones with remote team members
· Perform other duties as deemed necessary to provide quality service to the clients
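The kind of repetitive-task automation listed above (alongside Ansible playbooks and shell scripts) often starts as a small monitoring script. A minimal sketch, assuming only the Python standard library — the paths and the 80% threshold are illustrative choices, not values from this role:

```python
import shutil

def disk_usage_report(paths, threshold_pct=80.0):
    """Return (path, percent_used, over_threshold) for each given mount point."""
    report = []
    for path in paths:
        usage = shutil.disk_usage(path)  # named tuple: total, used, free
        pct = usage.used / usage.total * 100
        report.append((path, round(pct, 1), pct > threshold_pct))
    return report

# Check the root filesystem and print an alert line per mount point.
for path, pct, alert in disk_usage_report(["/"]):
    status = "ALERT" if alert else "ok"
    print(f"{path}: {pct}% used [{status}]")
```

A script like this would typically be wrapped in a cron job or an Ansible task, with the alert wired to the team's monitoring channel.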
Experience and Skills Required:
· 4+ years of experience in Linux and Windows administration
· 3 years of experience with VMware in a high-availability environment with disaster recovery
· Good experience with RAID and firewalls
· 2+ years of experience in SQL database server support
· Ability to quickly acquire an in-depth knowledge of multiple custom applications
· Experience in setting up IT policies based on best practices and monitoring them
· Experience in shell scripting and automating tasks
· Experience in hardware and software monitoring tools
· Experience in administration and best practices for Apache and Tomcat
· Experience in handling Cisco router and firewall configurations and management
· Working knowledge of SQL Server, Oracle, and other RDBMS databases
· Must be proactive and possess strong interpersonal, communication and organization skills
· Must possess excellent written and verbal presentation skills
· Must be self-motivated
· Certification in Linux/Windows administration is preferable.
Academics:
· Bachelor's / Master's degree (or equivalent) in computer science or related field or equivalent experience.
Manage support operations and lead the support engineering team.
Minimum Qualification: Bachelor's degree in Computer Science, Information Technology, or any related field. Strong technical skills and leadership experience are required.
Experience: 4+ years of experience in Software Engineering or technical support roles.
How to Apply:
- Login to tacoi.paromint.com.
- Navigate to your Profile.
- Copy your wallet public address from the app.
- Send an email.
- Include your wallet public address in the email.
- Mention the job code SM004 in the email.
- Attach your resume to the email.
Note: Candidates will be selected for interview based on their hedging ability, especially in options and commodity derivatives on tacoi.paromint.com.
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, and wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
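Ingesting from heterogeneous sources, as the bullets above describe, usually means normalizing each source into a common record shape before aggregating. A toy sketch using only the standard library — the `user`/`value` schema and the CSV-plus-JSON pairing are hypothetical, standing in for real batch extracts and streaming feeds:

```python
import csv
import io
import json

def ingest_batch(csv_text):
    """Parse a batch CSV extract into normalized event dicts."""
    return [{"user": row["user"], "value": float(row["value"])}
            for row in csv.DictReader(io.StringIO(csv_text))]

def ingest_stream(json_lines):
    """Parse newline-delimited JSON records, as a streaming source might deliver."""
    return [{"user": rec["user"], "value": float(rec["value"])}
            for rec in map(json.loads, json_lines)]

def aggregate(events):
    """Sum values per user across all normalized sources."""
    totals = {}
    for event in events:
        totals[event["user"]] = totals.get(event["user"], 0.0) + event["value"]
    return totals

batch = ingest_batch("user,value\nalice,3\nbob,4\n")
stream = ingest_stream(['{"user": "alice", "value": 2}'])
print(aggregate(batch + stream))  # -> {'alice': 5.0, 'bob': 4.0}
```

In the Hadoop-ecosystem stack this posting names, the same normalize-then-aggregate shape would be expressed with tools like Spark or Flink rather than plain Python.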
Experience Guidelines:
Mandatory Experience and Competencies:
# Competency
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies
2. Minimum 1.5 years of experience in Big Data technologies
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python (Java preferred)
5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
Preferred Experience and Knowledge (Good to Have):
# Competency
1. Good knowledge of, and hands-on experience with, traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres)
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ, RabbitMQ, or Solace, search and indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infra provisioning on cloud, automated build and deployment pipelines, code quality
6. Working knowledge of data-platform-related services on at least one cloud platform, IAM, and data security
7. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
We have a requirement for a Collibra Developer.
Experience required: 5-12 years
Experience in data governance and data quality management
Informatica PowerCenter (9.x, 10.2): minimum 2+ years of experience
SQL / PL/SQL: understanding of SQL procedures; able to convert procedures into Informatica mappings
Good to have: knowledge of Windows batch scripting





