11+ Video compression Jobs in Hyderabad | Video compression Job openings in Hyderabad
Expert knowledge and experience in video compression standards such as H.264/AVC, H.265/HEVC, VVC, VP9, and AV1
Experience in hardware encoding
Experience in streaming technologies like RTSP, RTMP, SRT, Transport over UDP/TCP
Experience in hardware video I/O will be an added advantage
Experienced in assessing visual quality using both objective metrics and subjective techniques
Excellent software design and debugging skills, and solid programming skills in C/C++
Good written and oral communication skills
Familiarity with video processing algorithms such as scaling, noise reduction, and tone mapping would be a plus
Familiarity with the latest computer vision and deep learning technologies would be a plus
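To make the objective-metric requirement above concrete, here is a minimal sketch of PSNR (peak signal-to-noise ratio), one of the most common objective video quality metrics; the function name and the flat pixel-list interface are illustrative, not part of the posting:

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two equally sized pixel sequences.

    Higher is better; identical frames score infinity.
    """
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return math.inf
    return 10.0 * math.log10(max_val ** 2 / mse)
```

In practice a codec engineer would compute this per frame over decoded YUV planes and pair it with subjective viewing tests, since PSNR alone correlates imperfectly with perceived quality.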

Global Digital Transformation Solutions Provider
SENIOR DATA ENGINEER:
ROLE SUMMARY:
Own the design and delivery of petabyte-scale data platforms and pipelines across AWS and modern Lakehouse stacks. You’ll architect, code, test, optimize, and operate ingestion, transformation, storage, and serving layers. This role requires autonomy, strong engineering judgment, and partnership with project managers, infrastructure teams, testers, and customer architects to land secure, cost-efficient, and high-performing solutions.
RESPONSIBILITIES:
- Architecture and design: Create HLD/LLD/SAD, source–target mappings, data contracts, and optimal designs aligned to requirements.
- Pipeline development: Build and test robust ETL/ELT for batch, micro-batch, and streaming across RDBMS, flat files, APIs, and event sources.
- Performance and cost tuning: Profile and optimize jobs, right-size infrastructure, and model license/compute/storage costs.
- Data modeling and storage: Design schemas and SCD strategies; manage relational, NoSQL, data lakes, Delta Lakes, and Lakehouse tables.
- DevOps and release: Establish coding standards, templates, CI/CD, configuration management, and monitored release processes.
- Quality and reliability: Define DQ rules and lineage; implement SLA tracking, failure detection, RCA, and proactive defect mitigation.
- Security and governance: Enforce IAM best practices, retention, audit/compliance; implement PII detection and masking.
- Orchestration: Schedule and govern pipelines with Airflow and serverless event-driven patterns.
- Stakeholder collaboration: Clarify requirements, present design options, conduct demos, and finalize architectures with customer teams.
- Leadership: Mentor engineers, set FAST goals, drive upskilling and certifications, and support module delivery and sprint planning.
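The SCD (Slowly Changing Dimension) strategies mentioned in the data modeling responsibility above can be sketched as follows. This is a toy, in-memory SCD Type 2 upsert, where a changed record expires the current row and appends a new version; the dict-based rows, field names, and function signature are illustrative assumptions, not a production pattern:

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, key, today=None):
    """SCD Type 2: close the current row for a changed key, append a new version.

    Each dimension row is a dict carrying start_date/end_date; the open
    (current) version of a key has end_date = None.
    """
    today = today or date.today()
    out = list(dim_rows)
    for row in incoming:
        current = next((r for r in out
                        if r[key] == row[key] and r["end_date"] is None), None)
        # Skip no-op updates: every incoming field already matches.
        if current and all(current.get(k) == v for k, v in row.items()):
            continue
        if current:
            current["end_date"] = today  # expire the old version
        out.append({**row, "start_date": today, "end_date": None})
    return out
```

In a real Lakehouse this logic would typically be expressed as a MERGE against a Delta or Iceberg table rather than Python dicts.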
REQUIRED QUALIFICATIONS:
- Experience: 15+ years designing distributed systems at petabyte scale; 10+ years building data lakes and multi-source ingestion.
- Cloud (AWS): IAM, VPC, EC2, EKS/ECS, S3, RDS, DMS, Lambda, CloudWatch, CloudFormation, CloudTrail.
- Programming: Python (preferred), PySpark, SQL for analytics, window functions, and performance tuning.
- ETL tools: AWS Glue, Informatica, Databricks, GCP DataProc; orchestration with Airflow.
- Lakehouse/warehousing: Snowflake, BigQuery, Delta Lake/Lakehouse; schema design, partitioning, clustering, performance optimization.
- DevOps/IaC: Deep hands-on Terraform experience; CI/CD with GitHub Actions and Jenkins; config governance and release management.
- Serverless and events: Design event-driven distributed systems on AWS.
- NoSQL: 2–3 years with DocumentDB including data modeling and performance considerations.
- AI services: AWS Entity Resolution, AWS Comprehend; run custom LLMs on Amazon SageMaker; use LLMs for PII classification.
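The PII classification requirement above can be illustrated with a deliberately simple rule-based masker; a real pipeline would use a managed service such as AWS Comprehend or an LLM classifier as the posting suggests, and these two regexes (emails and bare 10-digit phone numbers) are only an assumption for the sketch:

```python
import re

def mask_pii(text):
    """Toy rule-based PII masking: redact emails and 10-digit phone numbers."""
    text = re.sub(r"[\w.+-]+@[\w-]+\.[\w.]+", "<EMAIL>", text)
    text = re.sub(r"\b\d{10}\b", "<PHONE>", text)
    return text
```

Rule-based masking is a common fallback or pre-filter; ML-based detection handles names, addresses, and free-text identifiers that regexes miss.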
NICE-TO-HAVE QUALIFICATIONS:
- Data governance automation: 10+ years defining audit, compliance, retention standards and automating governance workflows.
- Table and file formats: Apache Parquet; Apache Iceberg as analytical table format.
- Advanced LLM workflows: RAG and agentic patterns over proprietary data; re-ranking with index/vector store results.
- Multi-cloud exposure: Azure ADF/ADLS, GCP Dataflow/DataProc; FinOps practices for cross-cloud cost control.
OUTCOMES AND MEASURES:
- Engineering excellence: Adherence to processes, standards, and SLAs; reduced defects and non-compliance; fewer recurring issues.
- Efficiency: Faster run times and lower resource consumption with documented cost models and performance baselines.
- Operational reliability: Faster detection, response, and resolution of failures; quick turnaround on production bugs; strong release success.
- Data quality and security: High DQ pass rates, robust lineage, minimal security incidents, and audit readiness.
- Team and customer impact: On-time milestones, clear communication, effective demos, improved satisfaction, and completed certifications/training.
LOCATION AND SCHEDULE:
- Location: Outside US (OUS).
- Schedule: Minimum 6 hours of overlap with US time zones.
Job Summary:
We are looking for a skilled and motivated .NET Full Stack Developer with strong expertise in .NET Core, React, and Microservices architecture. The ideal candidate will be responsible for designing, developing, and maintaining scalable, high-performance applications while collaborating with cross-functional teams.
Key Responsibilities:
- Design, develop, and maintain web applications using .NET Core / ASP.NET Core and React.js
- Build and implement microservices-based architecture for scalable systems
- Develop and consume RESTful APIs
- Collaborate with UI/UX designers to implement responsive and user-friendly interfaces
- Ensure code quality through unit testing, code reviews, and best practices
- Work with databases such as SQL Server / NoSQL databases
- Optimize applications for maximum speed and scalability
- Participate in Agile ceremonies like sprint planning, stand-ups, and retrospectives
- Troubleshoot and debug production issues
Required Skills:
- Strong experience in C#, .NET Core, ASP.NET Core
- Hands-on experience with React.js, JavaScript, HTML, CSS
- Solid understanding of Microservices architecture
- Experience in building and consuming REST APIs
- Knowledge of Entity Framework / ORM tools
- Experience with SQL Server / PostgreSQL / MongoDB
- Familiarity with Git / version control systems
- Understanding of design patterns and clean architecture
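The "design patterns and clean architecture" requirement above can be sketched with the repository pattern, where business logic depends on an abstraction rather than a concrete data store. The sketch below is in Python for brevity (the role itself uses C#/.NET); all class and method names are illustrative:

```python
from abc import ABC, abstractmethod

class UserRepository(ABC):
    """Abstraction the service depends on; swap implementations freely."""
    @abstractmethod
    def get(self, user_id): ...
    @abstractmethod
    def add(self, user): ...

class InMemoryUserRepository(UserRepository):
    """Concrete store for tests; production might back this with SQL Server."""
    def __init__(self):
        self._rows = {}
    def get(self, user_id):
        return self._rows.get(user_id)
    def add(self, user):
        self._rows[user["id"]] = user

class UserService:
    def __init__(self, repo: UserRepository):
        self._repo = repo  # depend on the interface, not a concrete class
    def register(self, user_id, name):
        if self._repo.get(user_id):
            raise ValueError("duplicate user")
        self._repo.add({"id": user_id, "name": name})
        return user_id
```

The same shape maps directly to C# interfaces with dependency injection, which is the idiomatic ASP.NET Core equivalent.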
Position Summary:
As a CRM ETL Developer, you will be responsible for the analysis, transformation, and integration of data from legacy and external systems into the CRM application. This includes developing ETL/ELT workflows, ensuring data quality through cleansing and survivorship rules, and supporting daily production loads. You will work in an Agile environment and play a vital role in building scalable, high-quality data integration solutions.
Key Responsibilities:
- Analyze data from legacy and external systems; develop ETL/ELT pipelines to ingest and process data.
- Cleanse, transform, and apply survivorship rules before loading into the CRM platform.
- Monitor, support, and troubleshoot production data loads (Tier 1 & Tier 2 support).
- Contribute to solution design, development, integration, and scaling of new/existing systems.
- Promote and implement best practices in data integration, performance tuning, and Agile development.
- Lead or support design reviews, technical documentation, and mentoring of junior developers.
- Collaborate with business analysts, QA, and cross-functional teams to resolve defects and clarify requirements.
- Deliver working solutions via quick POCs or prototypes for business scenarios.
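The survivorship rules mentioned in the responsibilities above collapse duplicate records from multiple systems into one "golden" record. A minimal sketch, assuming a source-priority rule (the record layout, the `source` field, and the priority list are illustrative; real Siebel EIM survivorship is configured in the tool):

```python
def apply_survivorship(records, priority):
    """Collapse duplicates into a golden record.

    For each field, keep the first non-empty value from the
    highest-priority source in `priority`.
    """
    ranked = sorted(records, key=lambda r: priority.index(r["source"]))
    golden = {}
    for rec in ranked:
        for field, value in rec.items():
            if field == "source":
                continue
            if field not in golden and value not in (None, ""):
                golden[field] = value
    return golden
```

Other common survivorship rules (most recent, most complete) differ only in the sort key.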
Technical Skills:
- ETL/ELT Tools: 5+ years of hands-on experience in ETL processes using Siebel EIM.
- Programming & Databases: Strong SQL & PL/SQL development; experience with Oracle and/or SQL Server.
- Data Integration: Proven experience in integrating disparate data systems.
- Data Modelling: Good understanding of relational, dimensional modelling, and data warehousing concepts.
- Performance Tuning: Skilled in application and SQL query performance optimization.
- CRM Systems: Familiarity with Siebel CRM, Siebel Data Model, and Oracle SOA Suite is a plus.
- DevOps & Agile: Strong knowledge of DevOps pipelines and Agile methodologies.
- Documentation: Ability to write clear technical design documents and test cases.
Soft Skills & Attributes:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal abilities.
- Experience working with cross-functional, globally distributed teams.
- Proactive mindset and eagerness to learn new technologies.
- Detail-oriented with a focus on reliability and accuracy.
Preferred Qualifications:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- Experience in Tier 1 & Tier 2 application support roles.
- Exposure to real-time data integration systems is an advantage.
Job Overview
We are seeking an experienced Coupa Implementation and Configuration Consultant to lead and support end-to-end implementations of the Coupa Business Spend Management (BSM) platform, including Supplier Information Management (SIM).
The ideal candidate will possess strong Procure-to-Pay (P2P), Source-to-Contract (S2C), and financial process expertise, along with hands-on Coupa configuration and ERP integration experience.
This role requires close collaboration with Finance, Procurement, IT teams, and executive stakeholders to deliver scalable, compliant, and optimized spend management solutions.
Key Responsibilities
Implementation & Roll-Out
- Lead full lifecycle implementation of Coupa BSM modules.
- Drive Business Process Design workshops and requirement gathering.
- Manage global or multi-entity roll-outs.
- Conduct SIT, UAT, and go-live support.
Configuration & Technical Expertise
- Configure Procurement, Sourcing, Contracts, Catalogues, Invoicing, and Expenses modules.
- Manage Supplier Information Management (SIM) and onboarding workflows.
- Configure PR, PO, Receipt, and Invoicing lifecycle.
- Implement approval workflows, compliance controls, and security configurations.
- Handle advanced system configurations and policy enforcement.
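The approval workflows above are typically amount-based chains of approver roles. A toy sketch of the idea (the thresholds, role names, and function are hypothetical; in Coupa this is configured as approval chains in the platform, not coded):

```python
def approval_chain(amount, thresholds):
    """Return the approver roles required for a requisition amount.

    `thresholds` is a list of (minimum_amount, role) pairs in ascending order;
    every threshold the amount meets adds an approver to the chain.
    """
    return [role for limit, role in thresholds if amount >= limit]

# Hypothetical policy: everything needs a manager, large spend escalates.
POLICY = [(0, "Manager"), (10_000, "Director"), (100_000, "CFO")]
```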
Integration & Technical
- Lead API-based integrations between Coupa and ERP systems (SAP / Oracle / Workday, etc.).
- Support data migration, reconciliation, and validation.
- Ensure system performance and compliance alignment.
Reporting & Governance
- Enable spend visibility through dashboards and analytics.
- Support audit controls and procurement governance frameworks.
Required Skills
- Strong hands-on experience in Coupa BSM implementation.
- Expertise in P2P and S2C processes.
- Experience in Supplier Information Management (SIM).
- ERP integration exposure (API-based preferred).
- Business process design and documentation capability.
- Experience in enterprise or multi-country roll-outs.
- Strong stakeholder management skills.
- Coupa certification is mandatory.
We are looking for a Technical Lead in .NET with strong experience in the .NET domain and cloud platforms, along with data structures and algorithms.
Only immediate joiners in the Hyderabad region will be considered.
Strategic planning
Coaching and consulting leadership about HR matters
Building a competitive organization
Being a company culture and employee experience champion
Details:
- Should have experience as a Java/J2EE Developer
- Must have full-stack development experience.
- Must have SQL knowledge and a good understanding of stored procedures.
- 4–6 years of experience required.
- Should be from a product-based company, or have at least 1 year of prior experience at one.
About the Role
The Dremio India team owns the Data Lake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for technical leaders with passion and experience in architecting and delivering high-quality distributed systems at massive scale.
Responsibilities & ownership
- Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
- Lead and mentor others on concurrency and parallelization to deliver scalability, performance, and resource optimization in multithreaded and distributed environments
- Propose and promote strategic company-wide tech investments, balancing business goals, customer requirements, and industry standards
- Lead the team to solve complex, unknown and ambiguous problems, and customer issues cutting across team and module boundaries with technical expertise, and influence others
- Review and influence designs of other team members
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Partner with other leaders to nurture innovation and engineering excellence in the team
- Drive priorities with others to facilitate timely accomplishments of business objectives
- Perform RCA of customer issues and drive investments to avoid similar issues in the future
- Collaborate with Product Management, Support, and field teams to ensure that customers are successful with Dremio
- Proactively suggest learning opportunities about new technology and skills, and be a role model for constant learning and growth
Requirements
- B.S./M.S/Equivalent in Computer Science or a related technical field or equivalent experience
- Fluency in Java/C++ with 15+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models and their use in developing distributed and scalable systems
- 8+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Subject Matter Expert in one or more of: query processing and optimization, distributed systems, concurrency, microservice-based architectures, data replication, networking, storage systems
- Experience in taking company-wide initiatives, convincing stakeholders, and delivering them
- Expert in solving complex, unknown and ambiguous problems spanning across teams and taking initiative in planning and delivering them with high quality
- Ability to anticipate and propose plan/design changes based on changing requirements
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Hands-on experience working on projects on AWS, Azure, and GCP
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and GCP)
- Understanding of distributed storage systems such as S3, ADLS, or HDFS
- Excellent communication skills and affinity for collaboration and teamwork
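The concurrency and parallelization skills above can be illustrated with a toy scatter-gather aggregation, the shape a distributed query engine uses: partition the input, aggregate each partition in parallel, then merge the partial results. The sketch is in Python for brevity (the role itself calls for Java/C++), and the word-count workload is an illustrative stand-in for a real query operator:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def scatter_gather_count(rows, partitions=4):
    """Toy scatter-gather aggregation.

    Scatter: split rows round-robin into partitions.
    Map:     count each partition on its own worker thread.
    Gather:  merge the per-partition partial counts.
    """
    chunks = [rows[i::partitions] for i in range(partitions)]
    with ThreadPoolExecutor(max_workers=partitions) as pool:
        partials = pool.map(Counter, chunks)
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total
```

The key property is that the merge step (`Counter.update`) is associative, so partial aggregates can be combined in any order; the same idea underpins distributed GROUP BY execution.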

Knowledge Hut Solutions Pvt. Ltd., an edu-tech (product) company
Job Description
We are looking for an experienced and talented UI designer to design and shape unique, user-centric products and experiences. You will make deliberate design decisions and translate any given user-experience journey into a smooth and intuitive interaction. The ideal candidate should have experience working in agile teams with developers and UX designers.
- Can work independently on the Android Development platform
- Must have knowledge of both Java and Kotlin
- Good understanding of architectures such as MVVM and MVP.
- Must have at least 3 good-quality Android apps in a portfolio to showcase
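The MVVM requirement above separates UI state (the ViewModel) from the view that renders it, typically via an observable holder like Android's LiveData. A language-agnostic sketch of that idea, written in Python purely for compactness (the role itself uses Java/Kotlin; all names are illustrative):

```python
class Observable:
    """Minimal LiveData-style holder: stores a value, notifies observers on change."""
    def __init__(self, value=None):
        self.value = value
        self._observers = []

    def observe(self, fn):
        self._observers.append(fn)

    def set(self, value):
        self.value = value
        for fn in self._observers:
            fn(value)

class CounterViewModel:
    """The ViewModel owns UI state; the view only observes, never computes."""
    def __init__(self):
        self.count = Observable(0)

    def increment(self):
        self.count.set(self.count.value + 1)
```

In Kotlin the same shape is `MutableLiveData<Int>` in a `ViewModel` subclass, observed from an Activity or Fragment.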

