
Note: As we are a media company, this role requires only media-related technical skills, and you will be working on media technical operations. It does not require any software testing or software development skills.
Primary Responsibilities
▪ To perform 100% linear watch-through for Audio, Video, & Subtitle content for a variety
of clients and formats (SD, Blu-ray, etc.) and ensure that these assets are free of any
errors.
▪ To carry out menu quality testing and functionality logic testing for a variety of clients.
▪ Report and document issues found using project-related checklists and internal
database.
▪ Communicate with offload units about project status, issues found, etc.
▪ Send daily updates of work to the team.
▪ Communicate and interact with the Production Manager, Authoring team, offloading
units, and associated departments regarding issues related to the project.
▪ Update OMS and time tracker on a project-to-project basis.
▪ Ensure on-time and accurate completion of projects.
Qualifications
Education
▪ 10+3 or equivalent
▪ Experience
- Freshers
▪ Skills
- Good knowledge of Office packages such as Word, Excel, etc.
- Excellent written and oral communication skills.
- Eye for detail in noticing issues, both minor and major, with respect to menu design,
audio, video & subtitles and functionality logic.
- Exposure to game/software testing is a plus.
- Knowledge of Adobe Photoshop, Premiere, or other video/audio software is a plus.
- Exposure to AV equipment and other home-entertainment hardware is a plus.
- Must be comfortable with a computer and in tune with the latest technologies.
- Understanding of graphics, imagery, and sound specifications, such as colours,
resolution, and audio glitches, is a plus.
- A dedicated and well-rounded individual with the ability to work well under pressure as
part of a team with tight deadlines.
- A keen interest in media and multimedia is required.
- Good eyesight and no obvious hearing defects.
- Keen interest in working with different types of software and hardware is required.
- Desire to learn new concepts and cross-train with other departments.
- An ability to learn quickly and effectively.
- Ability to solve problems under pressure.
- Good judgement and logical/analytical skills.

About DigiCaptions India

Requirements
- Strong knowledge of JavaScript fundamentals, object-oriented programming, and web concepts, with 3-7 years of relevant experience building web apps at scale.
- You must have a strong understanding of semantic HTML / HTML5, CSS / CSS3.
- Experience or familiarity with React.js or Vue.js, and build tools like Webpack, Rollup, etc.
- Curiosity to dig into the internals of codebases and libraries, and the ability to set them up from scratch.
- Familiarity with Server Side Rendering (SSR) and website SEO
What You’ll Do
- Build website and frontend platform.
- Build monitoring graphs, a services directory, and an ML model management UI; ensure SEO and website performance are top-notch.
- Work with design teams, with the ability to lead other frontend engineers as the team expands.
- Make sound design decisions that scale and follow best practices.
Experience: 4+ years.
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency using Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
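To make the Snowpipe and task-automation duties above concrete, here is a minimal sketch that generates the kind of Snowflake DDL involved. All object names (EVENTS_PIPE, RAW_EVENTS, ETL_WH, etc.) are hypothetical, and the functions only build SQL strings; a real deployment would execute them through a Snowflake session.

```python
# Illustrative sketch: generating the Snowflake DDL described above
# (Snowpipe auto-ingestion plus a scheduled transformation task).
# All object names below are made up for the example.

def snowpipe_ddl(pipe: str, table: str, stage: str) -> str:
    """DDL for a Snowpipe that auto-ingests staged files into a table."""
    return (
        f"CREATE OR REPLACE PIPE {pipe} AUTO_INGEST = TRUE AS\n"
        f"  COPY INTO {table} FROM @{stage}\n"
        f"  FILE_FORMAT = (TYPE = 'JSON');"
    )

def task_ddl(task: str, warehouse: str, schedule_min: int, sql: str) -> str:
    """DDL for a scheduled task that runs one transformation statement."""
    return (
        f"CREATE OR REPLACE TASK {task}\n"
        f"  WAREHOUSE = {warehouse}\n"
        f"  SCHEDULE = '{schedule_min} MINUTE'\n"
        f"AS\n  {sql}"
    )

if __name__ == "__main__":
    print(snowpipe_ddl("EVENTS_PIPE", "RAW_EVENTS", "EVENTS_STAGE"))
    print(task_ddl("FLATTEN_EVENTS", "ETL_WH", 5,
                   "INSERT INTO EVENTS SELECT * FROM RAW_EVENTS;"))
```

In practice the pipe reacts to cloud-storage notifications while the task handles the downstream transformation on a schedule, which is the ingestion/transformation split the responsibilities above describe.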
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
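The automated data-validation responsibility above can be sketched as a small rule engine: each rule checks one invariant on an ingested record, and failures are collected rather than raised so a pipeline can quarantine bad rows. The field names ("event_id", "amount") are illustrative, not from the posting.

```python
# Minimal sketch of an automated data-validation step in an ingestion
# pipeline. Each rule maps a name to a predicate over one record.
RULES = {
    "event_id present": lambda r: bool(r.get("event_id")),
    "amount non-negative": lambda r: (
        isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0
    ),
}

def validate(record: dict) -> list[str]:
    """Return the names of all rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

good = {"event_id": "e1", "amount": 10.5}
bad = {"amount": -3}
print(validate(good))  # []
print(validate(bad))   # both rules fail
```

A real framework (e.g. dbt tests or Great Expectations) expresses the same idea declaratively, but the collect-don't-crash pattern is the core of keeping data reliable through the transformation lifecycle.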
Hiring for Lead Auditor (QMS / ISMS) role.
Job description Below :
- Preparation of departmental objective reports.
- Preparation of the internal audit schedule and coordination of opening and closing meetings.
- Follow-up on closure of internal audit observations.
- Preparation of management review meeting input and output reports.
- Coordination of certification and surveillance (TUV-SUD) audits for ISO 9001:2015 and ISO 27001:2013.
- Follow-up on closure of observations.
- Updating the QMS & ISMS manual, procedures, policies, risk assessment plan, SOA, and formats.
- Internal audits against ISO QMS & ISMS standards.


Required Skills:
- Fluency in any one of JavaScript, TypeScript, or Python.
- Strong problem-solving skills.
- Should have built large scalable enterprise applications from scratch.
- Strong experience in architectural patterns, High-level designs.
- Experience in NoSQL and SQL DBs.
- You have a knack for launching and iterating on products quickly with quality and efficiency
- Willingness to learn and ability to flourish in a dynamic, high-growth, entrepreneurial environment
- Hands-on, self-starter, capable of working independently
- True love for technology and what you do
- Maniacal attention to detail
- 3+ years of experience
The expectation is to set up complete automation of the CI/CD pipeline and monitoring, and to ensure high availability of the pipeline. The automated deployment environment may be on-prem or cloud (virtual instances, containerized, or serverless). The role also covers complete test automation and ensuring the security of both the application and the infrastructure.
ROLES & RESPONSIBILITIES
- Configure Jenkins with load distribution between master/slave nodes.
- Set up the CI pipeline with Jenkins and cloud (AWS or Azure): code build and static tests (quality & security).
- Set up dynamic test configuration with Selenium and other tools.
- Set up application and infrastructure scanning for security, including a post-deployment security plan with PEN testing and use of a RASP tool.
- Configure and ensure high availability of the pipeline and monitoring.
- Set up composition analysis in the pipeline.
- Set up the SCM and artifact repository, and manage branching, merging, and archiving.
- Must work in an Agile environment using an ALM tool like Jira.
DESIRED SKILLS
Extensive hands-on Continuous Integration and Continuous Delivery experience with .NET, Node, Java, and C++ based projects (web, mobile, and standalone). Experience configuring and managing:
- ALM tools like Jira, TFS, etc.
- SCM such as GitHub, GitLab, CodeCommit
- Automation tools such as Terraform, CHEF, or Ansible
- Package repo configuration (Artifactory / Nexus), package managers like NuGet & Chocolatey
- Database configuration (SQL & NoSQL), web/proxy setup (IIS, Nginx, Varnish, Apache).
- Deep knowledge of multiple monitoring tools and how to mine them for advanced data.
- Prior work with Helm, Postgres, MySQL, Redis, ElasticSearch, microservices, message queues, and related technologies.
- Test automation with Selenium / Cucumber; setting up of test simulators.
- AWS Certified Architect and/or Developer; Associate considered, Professional preferred.
- Proficient in Bash, PowerShell, Groovy, YAML, Python, and NodeJS; web concepts such as REST APIs; aware of MVC and SPA application design.
- TDD experience and quality control with SonarQube or Checkmarx, TIOBE TICS, and Coverity.
- Thorough with Linux (Ubuntu, Debian, CentOS), Docker (file/compose/volume), and Kubernetes cluster setup.
- Expert in workflow tools: Jenkins (declarative, plugins) / TeamCity, and build server configuration.
- Experience with AWS CloudFormation / CDK and delivery automation; ensure end-to-end deployments succeed and resources come up in an automated fashion.
- Good to have: ServiceNow configuration experience for collaboration.
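The staged, fail-fast pipeline this role builds (build, then static analysis, then dynamic tests, then security scans) can be sketched in a few lines. The stage names and the in-memory runner are illustrative, not a real Jenkins configuration.

```python
# Hedged sketch of fail-fast CI stage ordering: run each stage in turn
# and stop at the first failure, as a Jenkins pipeline would.
from typing import Callable

def run_pipeline(
    stages: list[tuple[str, Callable[[], bool]]]
) -> tuple[str, list[str]]:
    """Run stages in order; return status and the stages that passed."""
    passed: list[str] = []
    for name, stage in stages:
        if not stage():
            return (f"FAILED at {name}", passed)
        passed.append(name)
    return ("SUCCESS", passed)

stages = [
    ("build", lambda: True),
    ("static-analysis", lambda: True),
    ("dynamic-tests", lambda: False),  # simulated Selenium failure
    ("security-scan", lambda: True),
]
status, done = run_pipeline(stages)
print(status, done)  # FAILED at dynamic-tests ['build', 'static-analysis']
```

The point of the ordering is cost: cheap checks (compilation, linting) gate expensive ones (browser tests, scans), so a broken build never consumes Selenium or scanner time.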
What you will get:
- To be a part of the Core-Team 💪
- A Chunk of ESOPs 🚀
- Creating High Impact by Solving a Problem at Large (No one in the World has a similar product) 💥
- High Growth Work Environment ⚙️
What we are looking for:
- An 'Exceptional Executor' -> Leader -> Create an Impact & Value 💰
- Ability to take Ownership of your work
- Past experience in leading a team
- Proficient in Java, Node or Python
- Experience with NewRelic, Splunk, SignalFx, DataDog etc.
- Monitoring and alerting experience
- Full stack development experience
- Hands-on with building and deploying micro services in Cloud (AWS/Azure)
- Experience with Terraform for Infrastructure as Code
- Should have experience troubleshooting live production systems using monitoring/log analytics tools
- Should have experience leading a team (2 or more engineers)
- Experienced using Jenkins or similar deployment pipeline tools
- Understanding of distributed architectures
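As a concrete illustration of the monitoring-and-alerting skills above, here is a rolling error-rate check of the kind tools like Datadog or New Relic evaluate against live production systems. The window size and threshold are assumptions chosen for the example.

```python
# Illustrative sketch: fire an alert when the error rate over the last
# `window` requests exceeds `threshold` (a sliding-window monitor).
from collections import deque

class ErrorRateAlert:
    def __init__(self, window: int = 100, threshold: float = 0.05):
        self.samples: deque[bool] = deque(maxlen=window)
        self.threshold = threshold

    def record(self, is_error: bool) -> bool:
        """Record one request; return True if the alert should fire."""
        self.samples.append(is_error)
        rate = sum(self.samples) / len(self.samples)
        return rate > self.threshold

alert = ErrorRateAlert(window=10, threshold=0.2)
fired = [alert.record(i % 3 == 0) for i in range(10)]
print(fired[-1])  # error rate is 0.4 > 0.2, so the alert fires
```

Rate-based windows like this are preferred over alerting on single failures because they tolerate isolated blips while still catching sustained degradation.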

Role: Data Analytics Lead / Manager
- Lead cross-functional projects using advanced data modelling and analysis techniques to discover insights that will guide strategic decisions and uncover optimization opportunities.
- Build, develop, and maintain data models, reporting systems, data automation systems, dashboards, and performance metrics that support key business decisions.
- Coordinate with different teams to determine requirements for data warehousing, reporting and analytical solutions, and ensure customer satisfaction on deliverables
- Oversee analytics projects to extract, manage, and analyse data from multiple applications, ensuring that deadlines are met.
- Apply statistics and data modelling to gain actionable business insights and boost customer productivity and revenue.
- Enforce company policies and procedures to ensure quality and prevent discrepancies.
- Communicate and track key performance metrics across departments.
- Keep abreast of industry best practices and policies.
- Research the latest trends, analyse data, identify opportunities, and incorporate changes into business strategies.
- Manage and optimize processes for data intake, validation, mining and engineering as well as modelling, visualization and communication deliverables.
- Examine, interpret and report results of analytical initiatives to stakeholders in leadership, technology, sales, marketing and product teams.
- Oversee the data/report requests process: tracking requests submitted, prioritization, approval, etc.
- Develop and implement MLDevOps practices, quality controls, and departmental standards to meet quality standards, organizational expectations, and regulatory requirements.
- Anticipate future demands of initiatives related to people, technology, budget and business within your department and design/implement solutions to meet these needs.
- Organize and drive successful completion of data insight initiatives through effective management of data analysts and effective collaboration with stakeholders.
Essential Skills
- Working knowledge of data mining principles: predictive analytics, mapping, collecting data from multiple data systems on premises and cloud-based data sources.
- Strong SQL skills, ability to perform effective querying involving multiple tables and subqueries.
- Understanding of and experience using analytical concepts and statistical techniques: hypothesis development, designing tests/experiments, analyzing data, drawing conclusions, and developing actionable recommendations for business units.
- Experience and knowledge of statistical modelling techniques: multiple regression, logistic regression, log-linear regression, variable selection, etc.
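The SQL skill called out above (effective querying involving multiple tables and subqueries) can be shown self-contained with an in-memory SQLite database. The tables and column names are made up for the example.

```python
# Self-contained illustration: a join plus a subquery, run against an
# in-memory SQLite database. Schema and data are invented for the demo.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'North'), (2, 'South');
    INSERT INTO orders VALUES (1, 1, 50), (2, 1, 150), (3, 2, 30);
""")

# Customers whose total order value exceeds the average order amount.
rows = conn.execute("""
    SELECT c.id, c.region, SUM(o.amount) AS total
    FROM customers AS c
    JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.id
    HAVING total > (SELECT AVG(amount) FROM orders)
""").fetchall()
print(rows)  # [(1, 'North', 200.0)]
```

The average order amount here is (50 + 150 + 30) / 3 ≈ 76.67, so only customer 1 (total 200.0) clears the subquery's bar; this join-aggregate-filter shape is the bread and butter of the analytical querying the role describes.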
We’re looking for someone with 10+ years of experience monitoring, managing, transforming, and drawing insights from data, and at least 5 years of experience leading a data analyst team.
Experience in Azure is preferred


