8+ Real time media streaming Jobs in India
Amagi Media Labs is rapidly disrupting the TV broadcasting industry as we know it. Founded in 2008, Amagi is now the largest TV ad network, with over 30 million advertising seconds and more than 400 TV channels running on Amagi's cloud infrastructure and managed playout servers. To know more about what Amagi does, please visit www.amagi.com
Technical Program Manager Job Description
Amagi is looking to expand its project/program management team, which is responsible for managing end-to-end development and delivery of various ongoing software development programs and initiatives within Amagi. We are looking for people who are process-oriented, have a strong technology background, and have experience executing and managing large software projects. The software development projects under execution sit largely at the intersection of the media and cloud technology domains.
Responsibilities
- End-to-end ownership of multiple short term and long term product development and delivery initiatives for Amagi's key products
- Working closely with cross-functional teams like engineering, product, quality assurance, operations and customer success teams.
- Articulating and defining program milestones for successful and timebound completion of initiatives by developing project plans in collaboration with all the stakeholders.
- Identifying and resolving process bottlenecks, dependencies and related challenges
- Enterprise program management: Apart from standard product offerings, for enterprise customers Amagi offers tighter integration and customization of their workflows over the core product offerings. In this role you will manage end-to-end onboarding and go-live for such enterprise customers. This involves:
- Working with pre-sales, sales, onboarding, and the customer to gather requirements
- Preparing a detailed project plan and tracking it across teams; keeping the stakeholders (Sales, Onboarding, Customer) updated on status and milestones
- Working closely with the Enterprise Engagement Team in Broadcast Engineering to deliver according to the plan
- Getting the customer trained on the platform before go-live
- Formally handing off to support once the customer has gone live
- Create, publish, and maintain well-organized, comprehensive data and information about the initiatives to enable organization-wide transparency and reporting
- Developing dashboards and success metrics to provide visibility to various stakeholders.
- Ensuring consistent and clear status reporting
- Prepare presentations and provide supporting data as needed
- Managing one or more programs involving product development initiatives belonging to various categories, like
- Cutting edge cloud technology intensive SaaS products
- Software solutions involving media technology (e.g., audio/video streaming)
- Web frontend and backend related products (built on latest web frameworks)
- Setting up, actively improving, and maintaining processes for product development and delivery initiatives to enable efficiency across the organization
- Based on agile methodologies and project management best practices
- Training and evangelizing the participating teams on agile processes
- Managing executive stakeholders
- Helping the program management team as a whole to publish metrics and dashboards that give the executive team visibility into various programs.
- Helping the executive team to take directional calls on new and ongoing programs.
Desired Skills and Experience
We'll need you to have a good foundation for establishing technical and non-technical enablement needs for internal and external teams.
Required Skills:
- Communication skills: The candidate must be good at business communication, adept with all modern tools at their disposal, and able to organize and lead meetings as and when required.
- Technical Knowledge: The candidate must have a strong technical background and must be well aware of software engineering technology stacks popularly used in technology heavy companies. The candidate must be able to write and present tech heavy documentation.
- Process Knowledge: The candidate must have experience in running process oriented programs and projects, must have knowledge of agile frameworks, and should be able to set up agile processes for existing and new product development initiatives.
- Data-driven acumen: The candidate must be able to perform qualitative and quantitative analysis on available data and metrics to make calls and set direction for the team.
- Relationship management: The candidate must be able to maintain a good working relationship with various cross functional team members and key stakeholders.
Required Experience:
- Experience in running scrum ceremonies and managing related artifacts
- Hands on experience in project management tools (e.g. Jira)
- Domain experience: Prior experience in media technologies like OTT, audio/video technologies, AWS/GCP cloud based application or web application development
Formal Qualifications:
- Certifications (preferable): Agile certifications - CSM® / PMI-ACP® / SAFe®
- Preferred Work Experience: 2 - 5 years
- Enterprise Program managers: 5-7 years in Project/Program management
- Educational Qualifications: BE/BTech + MBA (From Tier I/II institutes)
Job Description:
We are looking for an exceptional Data Scientist Lead / Manager who is passionate about data and motivated to build large-scale machine learning solutions that power our data products. This person will contribute to data analytics for insight discovery and to the development of machine learning pipelines that support modeling of terabytes of daily data for various use cases.
Location: Pune (initially remote due to COVID-19)
*****Looking for someone who can start immediately or within a month. Hands-on experience in Python programming (minimum 5 years) is a must.
About the Organisation :
- It provides a dynamic, fun workplace filled with passionate individuals. We are at the cutting edge of advertising technology and there is never a dull moment at work.
- We have a truly global footprint, with our headquarters in Singapore and offices in Australia, United States, Germany, United Kingdom and India.
- You will gain work experience in a global environment. Our team speaks over 20 different languages, represents more than 16 nationalities, and over 42% of our staff are multilingual.
Qualifications:
• 8+ years of relevant work experience
• Master's or Bachelor's degree in computer science or engineering
• Working knowledge of Python and SQL
• Experience in time series data, data manipulation, analytics, and visualization
• Experience working with large-scale data
• Proficiency in various ML algorithms for supervised and unsupervised learning
• Experience working in Agile/Lean model
• Experience with Java and Golang is a plus
• Experience with BI toolkits such as Tableau, Superset, QuickSight, etc. is a plus
• Exposure to building large-scale ML models using one or more modern tools and libraries such as AWS SageMaker, Spark MLlib, Dask, TensorFlow, PyTorch, Keras, GCP ML Stack
• Exposure to modern Big Data tech such as Cassandra/Scylla, Kafka, Ceph, Hadoop, Spark
• Exposure to IAAS platforms such as AWS, GCP, Azure
Typical persona: Data Science Manager/Architect
Experience: 8+ years programming/engineering experience (with at least last 4 years in Data science in a Product development company)
Type: Hands-on candidate only
Must:
a. Hands-on Python: pandas, scikit-learn
b. Working knowledge of Kafka
c. Able to carry out their own tasks and help the team resolve problems, logical or technical (25% of the job)
d. Good analytical and debugging skills
e. Strong communication skills
Desired (in order of priority):
a. Go (strong advantage)
b. Airflow (strong advantage)
c. Familiarity and working experience with more than one type of database: relational, object, columnar, graph, and other unstructured databases
d. Data structures and algorithms
e. Experience with multi-threading and thread-synchronization concepts
f. AWS SageMaker
g. Keras
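Since the role asks for hands-on Python with time-series manipulation, here is a minimal sketch of a trailing rolling mean — the kind of transform pandas' `Series.rolling(window).mean()` performs — written with only the standard library. The function name and sample data are illustrative, not taken from the posting:

```python
from statistics import mean

def rolling_mean(series, window):
    """Trailing rolling mean over a list of numeric readings.

    Emits one value per full window; shorter prefixes are skipped,
    mirroring pandas' Series.rolling(window).mean().dropna().
    """
    if window < 1 or window > len(series):
        raise ValueError("window must be in [1, len(series)]")
    return [mean(series[i - window:i]) for i in range(window, len(series) + 1)]

# Hypothetical daily metric readings
readings = [10, 12, 11, 13, 15, 14]
print(rolling_mean(readings, 3))  # trailing 3-point averages
```

In an interview or on the job this would of course be done with pandas over real data; the point here is only the windowing logic.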
Job Description for IoT Data Engineer:
The role requires experience in Azure core technologies – IoT Hub/ Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos, Azure Data Factory, Azure SQL Database, Azure HDInsight / Databricks, SQL data warehouse.
You Have:
- Minimum 2 years of software development experience
- Minimum 2 years of experience in IoT/streaming data pipelines solution development
- Bachelor's and/or Master’s degree in computer science
- Strong Consulting skills in data management including data governance, data quality, security, data integration, processing, and provisioning
- Delivered data management projects with real-time/near real-time data insights delivery on Azure Cloud
- Translated complex analytical requirements into the technical design including data models, ETLs, and Dashboards / Reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
- Successfully delivered large scale IOT data management initiatives covering Plan, Design, Build and Deploy phases leveraging different delivery methodologies including Agile
- Experience handling telemetry data with Spark Streaming, Kafka, Flink, Scala, PySpark, and Spark SQL
- Hands-on experience with containers and Docker
- Exposure to streaming protocols like MQTT and AMQP
- Knowledge of OT network protocols like OPC UA, CAN Bus, and similar protocols
- Strong knowledge of continuous integration, static code analysis, and test-driven development
- Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore
- Must have excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platforms adoption across the enterprise
- Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations
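The streaming-telemetry requirements above (Spark Streaming / Kafka / Flink) largely come down to patterns such as windowed aggregation. As a framework-free sketch, here is a tumbling-window average in plain Python; the device names and readings are hypothetical, purely for illustration:

```python
from collections import defaultdict

def tumbling_window_avg(events, window_sec):
    """Group (timestamp_sec, device_id, value) telemetry events into
    fixed-size tumbling windows and average each device per window."""
    sums = defaultdict(lambda: [0.0, 0])  # (window_start, device) -> [sum, count]
    for ts, device, value in events:
        window_start = ts - (ts % window_sec)  # align to window boundary
        bucket = sums[(window_start, device)]
        bucket[0] += value
        bucket[1] += 1
    return {key: s / n for key, (s, n) in sorted(sums.items())}

# Hypothetical sensor readings: (epoch seconds, device, temperature)
events = [
    (100, "dev-1", 20.0),
    (105, "dev-1", 22.0),
    (112, "dev-1", 30.0),
    (103, "dev-2", 18.0),
]
print(tumbling_window_avg(events, 10))
```

In production this grouping would be expressed as, say, a Spark Structured Streaming window over a Kafka source; the in-memory version just shows the core aggregation.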
Roles & Responsibilities
You Will:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design
- Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks
- Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform, whether on Azure Cosmos DB, SQL DW / Synapse, or SQL
- Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
- Automate core activities to minimize the delivery lead times and improve the overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts
Metadata Technologies, North America
We are looking for an exceptional Software Developer for our Data Engineering India team who can contribute to building a world-class big data engineering stack that will fuel our Analytics and Machine Learning products. This person will contribute to the architecture, operation, and enhancement of:
- Our petabyte-scale data platform, with a key focus on finding solutions that can support the Analytics and Machine Learning product roadmap. Every day, terabytes of ingested data need to be processed and made available for querying and insight extraction for various use cases.
About the Organisation:
- It provides a dynamic, fun workplace filled with passionate individuals. We are at the cutting edge of advertising technology and there is never a dull moment at work.
- We have a truly global footprint, with our headquarters in Singapore and offices in Australia, United States, Germany, United Kingdom, and India.
- You will gain work experience in a global environment. Our team speaks over 20 different languages, represents more than 16 nationalities, and over 42% of our staff are multilingual.
Job Description
Position:
Software Developer, Data Engineering team
Location: Pune (initially 100% remote due to COVID-19 for the coming year)
- Our bespoke Machine Learning pipelines. This will also provide opportunities to contribute to the prototyping, building, and deployment of Machine Learning models.
You:
- Have at least 4 years' experience.
- Deep technical understanding of Java or Golang.
- Production experience with Python is a big plus and an extremely valuable supporting skill for us.
- Exposure to modern Big Data tech: Cassandra/Scylla, Kafka, Ceph, the Hadoop stack, Spark, Flume, Hive, Druid, etc., while at the same time understanding that certain problems may require completely novel solutions.
- Exposure to one or more modern ML tech stacks: Spark MLlib, TensorFlow, Keras, GCP ML Stack, AWS SageMaker is a plus.
- Experience includes working in Agile/Lean model
- Experience with supporting and troubleshooting large systems
- Exposure to configuration management tools such as Ansible or Salt
- Exposure to IAAS platforms such as AWS, GCP, Azure…
- Good addition: experience working with large-scale data
- Good addition: experience architecting, developing, and operating data warehouses, big data analytics platforms, and high-velocity data pipelines
**** Not looking for a Big Data Developer / Hadoop Developer
Mid / Senior Big Data Engineer
Job Description:
Role: Big Data Engineer
Number of open positions: 5
Location: Pune
At Clairvoyant, we're building a thriving big data practice to help enterprises enable and accelerate the adoption of big data and cloud services. In the big data space, we lead and serve as innovators, troubleshooters, and enablers. The big data practice at Clairvoyant focuses on solving our customers' business problems by delivering products designed with best-in-class engineering practices and a commitment to keep the total cost of ownership to a minimum.
Must Have:
- 4-10 years of experience in software development.
- At least 2 years of relevant work experience on large scale Data applications.
- Strong coding experience in Java is mandatory
- Good aptitude, strong problem-solving abilities and analytical skills, and the ability to take ownership as appropriate
- Should be able to code, debug, performance-tune, and deploy the apps to production.
- Should have good working experience with:
  - Hadoop ecosystem (HDFS, Hive, YARN, file formats like Avro/Parquet)
  - Kafka
  - J2EE frameworks (Spring/Hibernate/REST)
  - Spark Streaming or any other streaming technology
- Ability to take sprint stories to completion, with unit test coverage.
- Experience working in Agile Methodology
- Excellent communication and coordination skills
- Knowledge of (and preferably hands-on experience with) UNIX environments and various continuous integration tools.
- Must be able to integrate quickly into the team and work independently towards team goals
- Take the complete responsibility of the sprint stories' execution
- Be accountable for the delivery of the tasks in the defined timelines with good quality.
- Follow the processes for project execution and delivery.
- Follow agile methodology
- Work with the team lead closely and contribute to the smooth delivery of the project.
- Understand/define the architecture and discuss its pros and cons with the team
- Participate in brainstorming sessions and suggest improvements to the architecture/design.
- Work with other team leads to get the architecture/design reviewed.
- Work with the clients and counterparts (in the US) on the project.
- Keep all the stakeholders updated on project/task status, risks, and issues, if any.
Experience: 4 to 9 years
Keywords: java, scala, spark, software development, hadoop, hive
Locations: Pune