11+ Audio codecs Jobs in Hyderabad | Audio codecs Job openings in Hyderabad
Expert knowledge of and experience with video compression standards such as H.264/AVC, H.265/HEVC, H.266/VVC, VP9, and AV1
Experience in hardware encoding
Experience in streaming technologies like RTSP, RTMP, SRT, Transport over UDP/TCP
Experience in hardware video I/O will be an added advantage
Experienced in assessing visual quality using both objective metrics and subjective techniques
Excellent software design and debugging skills and solid programming skills in C/C++
Good written and oral communication skills
Familiarity with video processing algorithms such as scaling, noise reduction, and tone mapping would be a plus
Familiarity with the latest computer vision and deep learning technologies would be a plus
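Objective quality assessment of the kind mentioned above usually starts with pixel-level metrics such as PSNR. A minimal sketch in plain Python, using toy 4-pixel "frames" (real pipelines would compute PSNR, SSIM, or VMAF over full decoded frames; the pixel values here are invented for illustration):

```python
import math

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    if len(ref) != len(test):
        raise ValueError("frames must have the same number of pixels")
    mse = sum((r - t) ** 2 for r, t in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")  # identical frames: no distortion
    return 10.0 * math.log10((max_val ** 2) / mse)

# Toy reference frame vs. a slightly distorted reconstruction.
reference = [52, 55, 61, 59]
decoded   = [51, 56, 61, 58]
print(round(psnr(reference, decoded), 2))  # → 49.38
```

Higher values indicate a reconstruction closer to the reference; identical frames give infinite PSNR.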
About Kanerika:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Role Responsibilities:
The following are the high-level responsibilities you will take on, though they are not limited to these:
- Design, development, and implementation of modern data pipelines, data models, and ETL/ELT processes.
- Architect and optimize data lake and warehouse solutions using Microsoft Fabric, Databricks, or Snowflake.
- Enable business analytics and self-service reporting through Power BI and other visualization tools.
- Collaborate with data scientists, analysts, and business users to deliver reliable and high-performance data solutions.
- Implement and enforce best practices for data governance, data quality, and security.
- Mentor and guide junior data engineers; establish coding and design standards.
- Evaluate emerging technologies and tools to continuously improve the data ecosystem.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 5-7 years of experience in data engineering or data platform development, with at least 2–3 years in a lead or architect role.
- Strong hands-on experience in one or more of the following:
- Microsoft Fabric (Data Factory, Lakehouse, Data Warehouse)
- Databricks (Spark, Delta Lake, PySpark, MLflow)
- Snowflake (Data Warehousing, Snowpipe, Performance Optimization)
- Power BI (Data Modeling, DAX, Report Development)
- Proficiency in SQL and programming languages like Python or Scala.
- Experience with Azure, AWS, or GCP cloud data services.
- Solid understanding of data modeling, data governance, security, and CI/CD practices.
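As an illustration of the pipeline work the list above describes, here is a minimal, hypothetical "bronze to silver" cleansing step in plain Python. The table layout and field names are invented; in Fabric, Databricks, or Snowflake the same step would typically be expressed in Spark or SQL:

```python
from datetime import date

# Hypothetical raw "bronze" records as they might land from a source system.
bronze = [
    {"order_id": "1", "amount": "19.99", "order_date": "2024-03-01"},
    {"order_id": "2", "amount": "5.00",  "order_date": "2024-03-02"},
    {"order_id": "1", "amount": "19.99", "order_date": "2024-03-01"},  # duplicate
]

def to_silver(rows):
    """Clean and conform raw rows: cast types and drop duplicate order_ids."""
    seen, silver = set(), []
    for row in rows:
        key = row["order_id"]
        if key in seen:
            continue  # keep first occurrence only
        seen.add(key)
        silver.append({
            "order_id": int(key),
            "amount": float(row["amount"]),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return silver

print(len(to_silver(bronze)))  # → 2 (duplicate dropped)
```

Deduplication and type casting of this sort are the bread and butter of the ELT "silver" layer, before business aggregates are built on top.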
Preferred Qualifications:
- Familiarity with data modeling techniques and practices for Power BI.
- Knowledge of Azure Databricks or other data processing frameworks.
- Knowledge of Microsoft Fabric or other Cloud Platforms.
What we need:
- B.Tech in Computer Science or equivalent.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
- Recognized for excellence in data and AI solutions with industry awards and accolades.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Required Skills: Cold Calling, Email Campaign, Professional Networks (LinkedIn)
- Thorough understanding of IT products and services.
- Proficient in mining & managing data through database portals & professional networks.
- Generate appointments through activities like cold calls, email campaigns, and professional networking.
- Ensure accurate procedures are followed for cold calling, email campaigns, and professional networking activities to generate appointments.
- Maintain a systematic follow-up process to nurture leads not yet ready for an appointment.
- Proficiency in using the Zoho CRM system
- Meet quarterly and yearly targets as assigned.
- Maintain a healthy lead-generation pipeline.
Expectations:
- Excellent communication, interpersonal, and phone skills
- Ability to set up appointments proficiently using all inside sales activities.
- Proven track record in cold calling, email marketing, lead generation, and networking with professionals
- Able to thrive in a fast-paced environment and consistently meet deadlines.
- Self-motivated with strong multitasking abilities, able to work independently or collaboratively within a team.
- Familiarity with CRM, major database portals and professional networking platforms
- Define software requirements for embedded software applications
- Write software requirements (ASPICE SWE.1) using DOORS, including requirements analysis from system requirements or customer requirement documents.
- Participate in software development processes such as requirements capture, analysis, linking software requirements to system requirements and architecture, customer specifications.
- Software requirements development, flow-down and management, traceability using DOORS or equivalent requirements capture tool.
- Work within ASPICE process objectives, support the achievement of ASPICE levels, and assist with audits.
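The traceability that DOORS manages between system and software requirements can be sketched as a simple coverage check. The requirement IDs below are invented for illustration; a real project would export modules and link data from DOORS:

```python
# Hypothetical requirement IDs; real projects would export these from DOORS.
system_reqs = {"SYS-001", "SYS-002", "SYS-003"}
trace = {  # software requirement -> system requirement it refines
    "SWE-010": "SYS-001",
    "SWE-011": "SYS-001",
    "SWE-012": "SYS-003",
}

def orphan_system_reqs(system_reqs, trace):
    """System requirements with no downstream software requirement (a gap
    an ASPICE SWE.1 traceability audit would flag)."""
    covered = set(trace.values())
    return sorted(system_reqs - covered)

print(orphan_system_reqs(system_reqs, trace))  # → ['SYS-002']
```

Flagging uncovered system requirements like this is one of the checks behind the "flow-down and management, traceability" duty above.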
An ideal candidate must possess excellent logical and analytical skills. You will work both in a team and on diverse projects. The candidate must be able to deal smoothly and confidently with clients and personnel.
Key roles and Responsibilities:
⦁ Able to design and build efficient, testable, and reliable code.
⦁ Should be a team player, sharing ideas with the team for continuous improvement of the development process.
⦁ Good knowledge of Spring Boot, Spring MVC, J2EE, and SQL queries.
⦁ Stay updated on new tools, libraries, and best practices.
⦁ Adaptable and self-motivated; must be willing to learn new things.
⦁ Sound knowledge of HTML, CSS, and JavaScript.
Basic Requirements:
⦁ Bachelor's degree in Computer Science Engineering / IT or a related discipline with a good academic record.
⦁ Excellent communication and interpersonal skills.
⦁ Knowledge of the SDLC flow from requirement analysis to the deployment phase.
⦁ Should be able to design, develop, and deploy applications.
⦁ Able to identify bugs and devise solutions to address and resolve the issues.
Are you the one? Quick self-discovery test:
- Love for the cloud: When was the last time your dinner entailed an act on "How would 'Jerry Seinfeld' pitch Cloud platform & products to this prospect" while your friend did the 'Sheldon' version of the same thing?
- Passion: When was the last time you went to a remote gas station while on vacation and ended up helping the gas station owner SaaS-ify his 7 gas stations across other geographies?
- Compassion for customers: You listen more than you speak. When you do speak, people feel the need to listen.
- Humor for life: When was the last time you told a concerned CEO, 'If Elon Musk can attempt to take humanity to Mars, why can't we take your business to run on the cloud?'
So what are we looking for?
- Experience in On-premises to AWS cloud Migration.
- Knowledge of Linux and Windows servers.
- Application knowledge such as Java, .NET, Python, or Ruby.
- On-premises to cloud migration assessment experience is a must.
- Able to provide a detailed migration analysis report and present it to the customer.
- Creative problem-solving skills and superb communication skills.
- Respond to technical queries / requests from team members and customers.
- Ambitious individuals who can work under their own direction towards agreed targets/goals.
- Ability to handle change and be open to it along with good time management and being able to work under stress.
- Proven interpersonal skills while contributing to team effort by accomplishing related results as needed.
- Maintain technical knowledge by attending educational workshops, reviewing publications.
Job Responsibilities:
- Managing initiatives for migration and modernization in AWS cloud environment
- Lead and build modernization architecture solutions from on-premises or VMware environments to a modern platform (AWS cloud) through modular design, based on an understanding of application components
- Serve as lead and SME in modernization methodology; able to lead design-thinking workshops and tailor methods to the client's environment and industry
- Knowledge of the six most common application migration strategies is required:
- Re-host (referred to as "lift and shift")
- Re-platform (referred to as "lift, tinker, and shift")
- Re-factor / Re-architect
- Re-purchase
- Retire
- Retain (referred to as "re-visit")
- Application migration analysis experience, such as application compatibility on the cloud and network and security support on the cloud.
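The triage among the six migration strategies can be sketched as a simple rule table. The rules and attribute names below are deliberately simplified inventions; a real assessment weighs licensing, dependencies, compliance, and cost across many more factors:

```python
def suggest_strategy(app):
    """Very simplified triage into one of the six migration strategies.
    The attribute names are illustrative only."""
    if app.get("end_of_life"):
        return "Retire"           # no longer worth running anywhere
    if app.get("compliance_hold"):
        return "Retain"           # must stay on-premises for now
    if app.get("saas_alternative"):
        return "Re-purchase"      # replace with a SaaS product
    if app.get("needs_redesign"):
        return "Re-factor / Re-architect"
    if app.get("minor_changes_ok"):
        return "Re-platform"      # lift, tinker, and shift
    return "Re-host"              # plain lift and shift

print(suggest_strategy({"minor_changes_ok": True}))  # → Re-platform
```

In practice the output of such triage feeds the detailed migration analysis report mentioned earlier in the posting.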
Qualifications:
- Is education overrated? Yes, we believe so. But there is no way to locate you otherwise. So we might look for at least a Bachelor's or Master's degree in engineering from a reputed institute, or you should have been programming since you were 12.
- And the latter is better. We will find you faster if you specify the latter in some manner. Not just degrees, we are not too thrilled by tech certifications either :)
- Architects with 10+ years of total experience and 6+ years of experience modernizing applications, having led architecture initiatives on AWS modernization.
- Managed and implemented at least 5 engagements modernizing client applications to AWS Cloud, on WebSphere and Java/J2EE or .NET.
- Experience using DevOps tools during modernization.
- Complete in-depth experience and knowledge of AWS as a product and its components.
- AWS certification would be preferred.
- Experience in Agile fundamentals and methodology.
Ruby and Rails
Creating JSON based web services from Ruby on Rails (RoR) apps
HTML5, CSS3, JavaScript, jQuery, CoffeeScript, Ajax, lodash/underscore.js
A firm grasp of object-oriented analysis and design
Good to have: knowledge of a front-end framework such as Angular, Backbone.js, or Ember.js
Should have extensive experience in Agile software development principles, practice, and process
Should have worked on enterprise-class applications
RoR Performance tuning and scaling
Proficiency in English strongly preferred
A proactive and resourceful person who achieves with minimal oversight
Team player with the ability and desire to become an integral part of a fast-paced team
Good knowledge of relational databases MySQL, Oracle, Microsoft SQL Server, DB2 or similar
Hands-on experience with at least one of the NoSQL environments like MongoDB, Couchbase, Cassandra
Experience with text search systems like Elasticsearch, Solr, or similar
Hands-on experience in integrating with third-party REST APIs
Managing code with Git and other version control tools
Good to have knowledge of Nginx
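The "JSON based web services" requirement above is about returning serialized model data over HTTP. A sketch of the response shape, written in Python rather than Ruby purely for illustration: the `article` record and the `only` option are stand-ins for an ActiveRecord object and the serialization options a Rails `render json:` call would take:

```python
import json

# Hypothetical "model" record; in Rails this would come from ActiveRecord.
article = {"id": 42, "title": "Hello", "tags": ["ruby", "rails"]}

def render_json(record, only=None):
    """Mimic the body a Rails `render json:` response would produce,
    optionally restricting the serialized fields."""
    payload = {k: v for k, v in record.items() if only is None or k in only}
    return json.dumps(payload, separators=(",", ":"))

body = render_json(article, only=["id", "title"])
print(body)  # → {"id":42,"title":"Hello"}
```

Restricting fields at the serialization boundary like this is what keeps internal model attributes out of a public API.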
ETL Developer – Talend
Job Duties:
- The ETL Developer is responsible for the design and development of ETL jobs that follow standards and best practices and are maintainable, modular, and reusable.
- Proficiency with Talend or Pentaho Data Integration / Kettle.
- The ETL Developer will analyze and review complex object and data models and the metadata repository in order to structure the processes and data for better management and efficient access.
- Working on multiple projects, and delegating work to Junior Analysts to deliver projects on time.
- Training and mentoring Junior Analysts and building their proficiency in the ETL process.
- Preparing mapping documents to extract, transform, and load data, ensuring compatibility with all tables and requirement specifications.
- Experience in ETL system design and development with Talend / Pentaho PDI is essential.
- Create quality rules in Talend.
- Tune Talend / Pentaho jobs for performance optimization.
- Write relational (SQL) and multidimensional (MDX) database queries.
- Functional knowledge of Talend Administration Center / Pentaho Data Integrator, job servers and load-balancing setup, and all their administrative functions.
- Develop, maintain, and enhance unit test suites to verify the accuracy of ETL processes, dimensional data, OLAP cubes, and various forms of BI content including reports, dashboards, and analytical models.
- Exposure to the MapReduce components of Talend / Pentaho PDI.
- Comprehensive understanding and working knowledge of data warehouse loading, tuning, and maintenance.
- Working knowledge of relational database theory and dimensional database models.
- Creating and deploying Talend / Pentaho custom components is an added advantage.
- Java knowledge is nice to have.
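To give the extract/load and relational (SQL) query duties above a concrete, runnable flavor, here is a minimal sketch using Python's standard-library sqlite3. The table and column names are invented; a Talend or Pentaho job would model the same extract, load, and query steps graphically:

```python
import sqlite3

# Hypothetical source rows (extract step); a real job would read files or a DB.
source = [("2024-03-01", "widgets", 3),
          ("2024-03-01", "gadgets", 5),
          ("2024-03-02", "widgets", 2)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (sale_date TEXT, product TEXT, qty INTEGER)")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", source)  # load step

# A relational (SQL) query over the loaded fact table.
total = conn.execute(
    "SELECT SUM(qty) FROM fact_sales WHERE product = ?", ("widgets",)
).fetchone()[0]
print(total)  # → 5
```

The parameterized query (`?` placeholders) mirrors the variable substitution an ETL tool performs when the same job runs against different inputs.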
Skills and Qualification:
- BE, B.Tech / MS Degree in Computer Science, Engineering or a related subject.
- 3+ years of experience.
- Proficiency with Talend or Pentaho Data Integration / Kettle.
- Ability to work independently.
- Ability to handle a team.
- Good written and oral communication skills.




