11+ UVM Jobs in Hyderabad | UVM Job openings in Hyderabad
Apply to 11+ UVM Jobs in Hyderabad on CutShort.io. Explore the latest UVM Job opportunities across top companies like Google, Amazon & Adobe.
1. SV and UVM; knowledge of any one protocol (USB, DDR, PCIe, Ethernet, AXI, MIPI) is an added advantage.
2. Experience in verification of complex IPs or SoCs.
3. Expertise in SoC verification using C and SV/UVM.
4. Expertise in AMBA protocols (AXI/AHB/APB) and experience working with ARM processors.
5. Expertise in test plan creation and verification technologies such as code coverage, functional coverage, and assertions.
- Strong B2B Product Manager Profiles
- Mandatory (Experience 1) - Must have 5+ years of total experience, with the most recent 2+ years as a Product Manager
- Mandatory (Experience 2) - Must have experience working in early-stage products (with early stage startups or in big startups/companies driving 0-to-1 product development and building MVPs)
- Mandatory (Experience 3) - Must have hands-on experience in writing PRDs, BRDs, user stories, and maintaining product documentation for cross-functional teams.
- Mandatory (Company) - Product companies only (B2B preferred)
Preferred
- Preferred (Experience) – Prior experience as a developer or tech role is a plus
Experience in Python, Django, Flask, and related frameworks in a Unix/Linux or Windows environment.
Hands-on experience in full-stack Python application development.
Design and develop Python web applications following a microservices architecture, with attention to performance and the ability to scale on demand.
Good experience working with RDBMS (Relational Database Management System).
Experience with microservices architecture, containers, and Docker-based applications. Experience in developing web applications and REST APIs using the Flask/Django frameworks (JSON, XML, etc.).
Experience working with the Apache HTTP Server or other web application servers.
Package code and create executables/binaries in Python.
Experience with the Pandas, SciPy, and NumPy libraries.
Strong unit-testing and debugging skills; a good understanding of Python's threading limitations and multi-process architecture.
Experience in managing the source code base through version-control tools such as SVN, Git, and Bitbucket.
Thorough understanding of OOP concepts.
Experience working in an Agile development environment.
Good understanding of database (PostgreSQL/MySQL/Oracle/SQL Server).
Good communication and organization skills, a logical approach to problem solving, and strong time-management and task-prioritization abilities.
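The threading note above is worth unpacking: in CPython, the global interpreter lock means threads do not speed up CPU-bound work, which is why multi-process architecture is listed alongside it. A minimal sketch of the multiprocessing side (function names are illustrative, not from any specific codebase):

```python
# CPU-bound Python work is usually parallelized with processes, not threads:
# the GIL lets only one thread execute Python bytecode at a time, so threads
# mainly help I/O-bound work, while multiprocessing sidesteps the GIL.
from multiprocessing import Pool

def cpu_bound(n: int) -> int:
    """A CPU-heavy task: sum of squares below n."""
    return sum(i * i for i in range(n))

def run_in_processes(workloads):
    """Fan workloads out across worker processes."""
    with Pool(processes=2) as pool:
        return pool.map(cpu_bound, workloads)

if __name__ == "__main__":
    print(run_in_processes([10, 100]))
```

The same `map` call with a `ThreadPoolExecutor` would give little speedup here, because every worker thread contends for the one interpreter lock.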
Requirements: Skills and Qualifications
8-10 years of experience in Python, Django, Flask, and related frameworks in a Unix/Linux or Windows environment, preferably in the banking domain.
Language: Python
Frameworks: Django, Flask.
Libraries: SQLAlchemy, NumPy, SciPy, Pandas, etc.
OS: Windows, Linux/Unix
Version Control: Git, Bitbucket.
Databases: MySQL, Oracle, SQL Server, PostgreSQL.
Containers: Docker
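The stack above centers on REST APIs built with Flask/Django. As a dependency-free sketch of the same request-to-JSON-response shape, the standard library alone can serve a small endpoint (handler and route names are illustrative; a real service would use Flask or Django as listed):

```python
# Minimal JSON endpoint using only the standard library, illustrating the
# request -> JSON response pattern that Flask/Django formalize.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread
from urllib.request import urlopen

class PingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/ping":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep example output quiet

def serve_in_background():
    """Start the server on an OS-assigned port and return it."""
    server = HTTPServer(("127.0.0.1", 0), PingHandler)
    Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = serve_in_background()
    url = f"http://127.0.0.1:{srv.server_address[1]}/ping"
    print(json.loads(urlopen(url).read()))
    srv.shutdown()
```

In Flask the same endpoint collapses to a decorated function returning a dict; the point here is only the HTTP/JSON contract, not the framework.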
Note: Urgently looking for candidates serving their notice period or able to join immediately.
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions, applying a deep understanding of data integration and big data design principles, and independently driving design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role focuses on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies.
2. Minimum 1.5 years of experience in Big Data technologies.
3. Hands-on experience with the Hadoop stack (HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow) and other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferable.
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery.
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation.
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.
6. Working knowledge of data-platform services on at least one cloud platform, IAM, and data security.
7. Cloud data specialty and other related Big Data technology certifications.
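The mandatory competencies above revolve around building end-to-end data pipelines. Stripped of the specific tools (Kafka/NiFi for ingestion, Spark/Flink for computation, Hive/HDFS for storage), the shape is ingest, transform, load; a minimal plain-Python sketch, with record fields and stage names purely illustrative:

```python
# End-to-end pipeline shape as composable generator stages:
# ingest -> transform (wrangling/validation) -> load.
def ingest(records):
    """Ingestion stage: stream raw records from a source."""
    for rec in records:
        yield rec

def transform(stream):
    """Wrangling stage: clean, normalize, and validate each record."""
    for rec in stream:
        name = rec.get("name", "").strip().lower()
        if name:  # drop records that fail validation
            yield {"name": name, "value": int(rec["value"])}

def load(stream, sink):
    """Load stage: write transformed records to the sink."""
    for rec in stream:
        sink.append(rec)
    return sink

raw = [{"name": "  Alice ", "value": "3"}, {"name": "", "value": "9"}]
result = load(transform(ingest(raw)), sink=[])
```

Because each stage is a generator, records stream through one at a time, which is the same back-pressure-friendly property the listed streaming engines provide at scale.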
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes
1. Core Responsibilities
· Leading solutions for data engineering
· Maintain the integrity of both the design and the data that is held within the architecture
· Champion and educate people in the development and use of data engineering best practices
· Support the Head of Data Engineering and lead by example
· Contribute to the development of database management services and associated processes relating to the delivery of data solutions
· Provide requirements analysis, documentation, development, delivery and maintenance of data platforms.
· Develop database requirements in a structured and logical manner, ensuring delivery is aligned with business prioritisation and best practice
· Design and deliver performance enhancements, application migration processes and version upgrades across a pipeline of BI environments.
· Provide support for the scoping and delivery of BI capability to internal users.
· Identify risks and issues and escalate to Line / Project manager.
· Work with clients, existing asset owners and their service providers, and non-BI development staff to clarify and deliver work-stream objectives on timescales that meet overall project expectations.
· Develop and maintain documentation in support of all BI processes.
· Proactively identify cost-justifiable improvements to data manipulation processes.
· Research and promote relevant BI tools and processes that contribute to increased efficiency and capability in support of corporate objectives.
· Promote a culture that embraces change, continuous improvement and a ‘can do’ attitude.
· Demonstrate enthusiasm and self-motivation at all times.
· Establish effective working relationships with other internal teams to drive improved efficiency and effective processes.
· Be a champion for high quality data and use of strategic data repositories, associated relational model, and Data Warehouse for optimising the delivery of accurate, consistent and reliable business intelligence
· Ensure that you fully understand and comply with the organisation’s Risk Management Policies as they relate to your area of responsibility and demonstrate in your day to day work that you put customers at the heart of everything you do.
· Ensure that you fully understand and comply with the organisation’s Data Governance Policies as they relate to your area of responsibility and demonstrate in your day to day work that you treat data as an important corporate asset which must be protected and managed.
· Maintain the company’s compliance standards and ensure timely completion of all mandatory on-line training modules and attestations.
2. Experience Requirements
· 5 years' Data Engineering / ETL development experience is essential
· 5 years' data design experience in an MI / BI / Analytics environment (Kimball, lakehouse, data lake) is essential
· 5 years' experience of working in a structured Change Management project lifecycle is essential
· Experience of working in a financial services environment is desirable
· Experience of dealing with senior management within a large organisation is desirable
· 5 years' experience of developing on large, complex projects and programmes is desirable
· Experience mentoring other members of the team on best practice and internal standards is essential
· Experience with cloud data platforms (Microsoft Azure) is desirable
3. Knowledge Requirements
· A strong knowledge of business intelligence solutions and an ability to translate this into data solutions for the broader business is essential
· Strong demonstrable knowledge of data warehouse methodologies
· Robust understanding of high-level business processes is essential
· Understanding of data migration, including reconciliation, data cleanse and cutover is desirable
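The knowledge requirements above mention reconciliation as part of data migration. One common approach is to compare row counts and an order-independent content checksum between source and target; a sketch under that assumption (the hashing strategy and function names are illustrative, not a prescribed method):

```python
# Reconciliation check for a data migration: the target matches the source
# when both the row count and a combined, order-independent hash agree.
import hashlib

def table_fingerprint(rows):
    """Return (row_count, checksum); insensitive to row and column order."""
    digests = sorted(
        hashlib.sha256(repr(sorted(row.items())).encode()).hexdigest()
        for row in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

def reconcile(source_rows, target_rows):
    """True when the target extract matches the source after cutover."""
    return table_fingerprint(source_rows) == table_fingerprint(target_rows)
```

In practice the fingerprints would be computed inside each database (e.g. via aggregate hash functions) rather than in application memory, but the pass/fail contract is the same.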
Qualifications:
BTech/BE in computer science, electrical, electronics, or related fields. 5+ years of full-stack design and development experience. High emotional intelligence, empathy, and a collaborative approach. Experience with the Angular JavaScript framework, CSS, HTML5, NodeJS, ExpressJS, and MongoDB for full-stack web development. Experience developing rich, dynamic front-end applications using Angular and CSS frameworks such as Bulma, Angular Material, and Bootstrap. Knowledge of GraphQL would be an added advantage. Knowledge of cloud services such as AWS, Heroku, and Azure is preferable. Should be a quick learner who keeps pace with the ever-changing world of technology, as the candidate will get excellent exposure to the latest cloud-based SaaS technologies and best practices while working with varied customers across the globe.
Responsibilities:
Develop web applications covering the end-to-end software development life cycle, from writing UI code in Angular to backend API code in NodeJS, and managing databases such as MongoDB and MySQL. Manage the full-stack codebase from Git check-ins through automated builds and deployments, using DevOps practices to deploy to public cloud services such as AWS, Azure, and Heroku. Handle the full-stack web development workflow from front end to back end to CI/CD. Design and develop the tech architecture, working closely with the CEO and CTO of the company. Drive and guide the work of other engineers on the team.
This is a leadership role, and the candidate is expected to wear multiple technical hats, including customer interactions and investor discussions.
2. Arranging hotel, flight, and other transport bookings as per the travel itineraries in an effective and systematic manner to optimize time slots and priorities
3. Communicating with suppliers and service providers
4. Following up with customers for feedback about their experience of the tour and service
About Us:
We're on a mission to make it possible for every person, team, and company to be able to tailor their software to solve any problem and take on any challenge. Computers may be our most powerful tools, but most of us can't build or modify the software we use on them every day. At Notion, we want to change this with focus, design, and craft.
We've been working on this together since 2016, and have customers like Pixar, Mitsubishi, Figma, Plaid, Match Group, and thousands more on this journey with us. Today, we're growing fast and excited for new teammates to join us who are the best at what they do. We're passionate about building a company as diverse and creative as the millions of people Notion reaches worldwide.
About The Role:
Millions of people use Notion — and this number is increasing every day. That means a million people trust us to deliver a fast, reliable, and secure experience, and we value this more than anything. We want to keep earning trust, while also continuing to amaze our users with the tools they can build in Notion. This is where you come in — to help us forge a performant and reliable path forward to the future.
What You'll Achieve:
- Write clean, secure, tested, and documented code.
- Design & enhance the Notion platform with new capabilities, as and when the need arises.
- Contribute to monitoring & scaling the architecture of the platform and the infrastructure.
- Write technical documentation, and contribute to determining internal processes.
- Contribute to the recruiting of new backend employees
Skills You'll Need to Bring:
- 6+ years of experience building scalable platforms.
- Good understanding of database internals (relational or NoSQL) such as transactions and indexes, plus experience in schema design.
- 2+ years of experience working on the AWS platform, and acquaintance with technologies like Elastic Beanstalk, AWS Lambda, and Elastic Load Balancer.
- Good understanding of Docker internals.
- Good communication skills, good leadership skills, attention to detail, a sound understanding of algorithms, and object-oriented programming.
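The skills list above asks for an understanding of database internals such as transactions and indexes. A small sketch using the standard library's sqlite3 module (the schema is illustrative) shows both at once: a unique index enforcing a constraint, and a transaction rolling back atomically when that constraint is violated:

```python
# Transactions and indexes in miniature, via the stdlib sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")
# An index trades write cost for fast lookups; a UNIQUE index also enforces
# a constraint on the indexed column.
conn.execute("CREATE UNIQUE INDEX idx_users_email ON users (email)")

# The `with conn:` block is a transaction: it commits on success and rolls
# back everything if any statement inside it raises.
try:
    with conn:
        conn.execute("INSERT INTO users (email) VALUES (?)", ("a@example.com",))
        conn.execute("INSERT INTO users (email) VALUES (?)", ("a@example.com",))
except sqlite3.IntegrityError:
    pass  # duplicate email violated the unique index; both inserts rolled back

row_count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
```

The first insert alone would have succeeded, yet the table ends up empty: atomicity means the transaction's writes land together or not at all, which is the property schema designers lean on when grouping related writes.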
Nice to Haves:
- You're proficient with any part of our technology stack: React, TypeScript, Node.js, Memcached, Postgres, Docker, and Elasticsearch.
- You've heard of computing pioneers like Ada Lovelace, Douglas Engelbart, Alan Kay, and others—and understand why we're big fans of their work.
- You have interests outside of technology, such as in art, history, or social sciences.
Our customers come from all walks of life and so do we. We hire great people from a wide variety of backgrounds, not just because it's the right thing to do, but because it makes our company stronger. If you share our values and our enthusiasm for small businesses, you will find a home at Notion.
Notion is proud to be an equal opportunity employer. We do not discriminate in hiring or any employment decision based on race, color, religion, national origin, age, sex (including pregnancy, childbirth, or related medical conditions), marital status, ancestry, physical or mental disability, genetic information, veteran status, gender identity or expression, sexual orientation, or other applicable legally protected characteristic. Notion considers qualified applicants with criminal histories, consistent with applicable federal, state and local law. Notion is also committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures. If you need assistance or an accommodation made due to a disability, please let your recruiter know.
Role: Senior Campus Recruiter
Experience: 5 – 8 Years
Location: Hyderabad
Technovert is a new-generation services/product technology firm built on a people foundation. We have a strong heritage built on great people who put customers first and deliver exceptional results with no surprises, every time. We deliver technology solutions at the intersection of user experience, business goals, and information technology.
Technovert is well known for innovative intern and fresher talent-acquisition strategies. We nurture our talent in an open environment that allows for individual growth. Some of our key leaders have risen through the ranks from freshers to managing key business divisions, which makes this role crucial for us.
The Senior Campus Recruiter provides an end-to-end graduate recruitment experience: collating demand, strategizing the plan, and executing it to ensure the right hires are made at the right time. The Senior Campus Recruiter brings a passion for connecting Technovert with potential talent, enabling a great career start for undergraduates.
We would love you to:
- Create, strategize and execute campus recruitment plans to meet the demand
- Assess fresh talent across technical and competency-based assessments in high volume
- Explore and develop new channels to connect students with Technovert
- Advocate a great talent experience and maintain Technovert's reputation
- Carry out continuous research, create and maintain data points for analysis and reporting
- Seek the delta and continuously evolve campus hiring process at Technovert
You bring:
- Knowledge and strategy in competing and hiring best talent (Tech and Non-Tech)
- Strong relationships with good universities/colleges pan-India
- Broad exposure to branding fundamentals via social media
- Excellent Stakeholder Management and relationship building experience
- Proven knowledge in assessment methodologies / tools required for hiring fresh talent
- Total experience of 7-10 years; should be interested in teaching and research
- 3+ years' experience in data engineering, including data ingestion, preparation, provisioning, automated testing, and quality checks.
- 3+ years of hands-on experience with Big Data cloud platforms such as AWS and GCP, data lakes, and data warehouses.
- 3+ years with Big Data and analytics technologies. Experience in SQL and in writing code for the Spark engine in Python, Scala, or Java. Experience in Spark and Scala.
- Experience in designing, building, and maintaining ETL systems.
- Experience with data pipeline and workflow management tools such as Airflow.
- Application development background, along with knowledge of analytics libraries, open-source natural language processing, and statistical and big data computing libraries.
- Familiarity with visualization and reporting tools such as Tableau and Kibana.
- Should be good at technology storytelling.
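Workflow managers like Airflow, named in the list above, model a pipeline as a DAG of tasks executed in dependency order. A minimal sketch of that idea using the standard library's graphlib (task names are illustrative and this is not Airflow's API):

```python
# The core Airflow idea in miniature: tasks declare their upstream
# dependencies, and a topological sort yields a valid execution order.
from graphlib import TopologicalSorter

# task -> set of upstream tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

order = list(TopologicalSorter(dag).static_order())
```

Real schedulers add retries, backfills, and parallel execution of independent branches on top, but dependency-ordered execution of a DAG is the foundation.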
Qualification: B.Tech/BE/M.Sc/MBA/B.Sc. Certifications in Big Data technologies and cloud platforms such as AWS, Azure, and GCP will be preferred.
Primary Skills: Big Data + Python + Spark + Hive + Cloud Computing
Secondary Skills: NoSQL+ SQL + ETL + Scala + Tableau
Selection Process: 1 Hackathon, 1 Technical round and 1 HR round
Benefit: Free-of-cost training in Data Science from top-notch professors





