- Suggest programmatic strategy and tactics that coordinate with and augment the overall media plan objectives for B2B clients
- Create, actively manage, and optimize programmatic campaigns in DSPs
- Be consistently curious and inquisitive – learn how platforms work, and more importantly, learn how to game them
- Manage campaigns across programmatic mobile, display, video, OTT, CTV, and audio
- Understand how data from various channels can help the overall customer journey for each client account
- Identify, suggest, and set up private marketplace deals
- Continually acquire knowledge by maintaining relationships with a portfolio of key programmatic vendors
- Actively monitor campaign KPIs via reporting
- Produce and present engaging campaign analysis to clients, demonstrating the effectiveness of planning, testing, and media optimization and management
- Provide insight and guidance to members of the integrated media team
- Maintain and stay current on industry news and research
Qualifications
- Hands-on experience with one or more of the following DSPs: Google DoubleClick Bid Manager (DV360), The Trade Desk, DataXu, Amazon Advertising Platform, Adobe/TubeMogul, or Sizmek
- DoubleClick Campaign Manager/DoubleClick Bid Manager certifications are preferred but not required
- Strong written and verbal communication skills
- Detail-oriented and organized
- Able to multitask and work well under pressure
- Positive team player
- Proficiency in Microsoft Office applications
- Bachelor of Arts or Bachelor of Science degree

Work Mode: WFO (5 days)
Location: Hyderabad (Onsite)
Experience: 7+ years
- Hands-on Kubernetes (K8s) experience
- Linux troubleshooting skills
- Experience with on-prem servers and their management
- Helm
- Docker
- Ingress and Ingress controllers
- Networking basics
- Strong communication skills
Must-Have Skills:
- Hands-on experience with air-gapped Kubernetes clusters, ideally in regulated industries (finance, healthcare, etc.).
- Strong expertise in CI/CD pipelines, programmable infrastructure, and automation.
- Proficiency in Linux troubleshooting, observability (Prometheus, Grafana, ELK), and multi-region disaster recovery.
- Security & compliance knowledge for regulated industries.
- Preferred: Experience with GKE, RKE, Rook-Ceph, and certifications like CKA, CKAD.
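The observability skills listed above center on Prometheus-style metrics, where error rates are derived from monotonically increasing counters. As a rough illustration only (plain Python with hypothetical sample data, mirroring the idea behind PromQL's `rate()` rather than any real Prometheus internals), a per-second rate can be computed from counter samples while tolerating counter resets on process restart:

```python
def counter_rate(samples):
    """Per-second rate from (timestamp, counter_value) samples.

    Counters are monotonic but reset to 0 when a process restarts;
    a drop in value is treated as a reset, so the post-reset value
    counts as increase from zero.
    """
    if len(samples) < 2:
        return 0.0
    increase = 0.0
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        increase += v1 - v0 if v1 >= v0 else v1  # reset: count from 0
    elapsed = samples[-1][0] - samples[0][0]
    return increase / elapsed if elapsed > 0 else 0.0

# Hypothetical scrape data: an errors counter sampled every 15 s,
# with a restart (counter reset) between t=30 and t=45.
samples = [(0, 100), (15, 130), (30, 190), (45, 20), (60, 50)]
print(counter_rate(samples))  # total increase of 140 over 60 s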
Who You Are
- A Kubernetes expert who thrives on scalability, automation, and security.
- Passionate about optimizing infrastructure, CI/CD, and high-availability systems.
- Comfortable troubleshooting Linux, improving observability, and ensuring disaster recovery readiness.
- A problem solver who simplifies complexity and drives cloud-native adoption.
What You’ll Do
- Architect & automate Kubernetes solutions for air-gapped and multi-region clusters.
- Optimize CI/CD pipelines & cloud-native deployments.
- Work with open-source projects, selecting the right tools for the job.
- Educate & guide teams on modern cloud-native infrastructure best practices.
- Solve real-world scaling, security, and infrastructure automation challenges.
Why Join Us?
- Work on high-impact Kubernetes projects in regulated industries.
- Solve real-world automation & infrastructure challenges with cutting-edge tools.
- Grow in a team that values learning, open-source contributions, and innovation.
Note: Salary will be offered based on your overall experience and last drawn salary.
Job Title: Python / Django Backend Developer
Experience: 3+ Years
Location: Gurgaon (Onsite)
Work Mode: 5 Days Working
About the Company
We are hiring for a product-based global furniture and homeware organization with operations in the UK and India. The company builds and maintains in-house digital platforms focused on design-to-delivery, supply-chain, and logistics. The team focuses on building scalable, high-performance internal systems.
Roles & Responsibilities
- Design, develop, and maintain RESTful APIs and backend services using Python & Django / Django REST Framework
- Build scalable, secure, and optimized database schemas and queries using PostgreSQL/MySQL
- Collaborate with frontend, product, and QA teams for end-to-end feature delivery
- Write clean, reusable, and testable code following best engineering practices
- Optimize application performance, reliability, and scalability
- Participate in code reviews, documentation, and CI/CD processes
- Deploy and manage backend services on cloud infrastructure and web servers
Required Skills & Qualifications
- Strong proficiency in Python and Django / Django REST Framework
- Solid understanding of relational databases (PostgreSQL/MySQL)
- Experience with REST API design, authentication & authorization
- Working knowledge of AWS services: EC2, ELB, S3, IAM, RDS
- Experience configuring and managing Nginx/Apache
- Familiarity with Git, Docker, and CI/CD workflows
- Strong problem-solving and debugging skills
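The authentication & authorization requirement above can be sketched at its simplest as signed-token verification. The snippet below is a hedged illustration using only the Python standard library, not the company's actual scheme (a real Django REST Framework service would typically use its built-in `TokenAuthentication` or a JWT library; the secret and user ID here are hypothetical):

```python
import hashlib
import hmac

SECRET = b"hypothetical-server-secret"  # in practice, from settings/env

def issue_token(user_id: str) -> str:
    """Return 'user_id:signature', signed with the server secret."""
    sig = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    return f"{user_id}:{sig}"

def verify_token(token: str):
    """Return the user_id if the signature checks out, else None."""
    user_id, _, sig = token.partition(":")
    expected = hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information via timing differences
    return user_id if hmac.compare_digest(sig, expected) else None

token = issue_token("42")
print(verify_token(token))               # "42"
print(verify_token(token + "tampered"))  # None
```

The constant-time comparison is the non-obvious part: a naive `==` on signatures can leak how many leading characters matched.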
Preferred Qualifications
- Experience with cloud platforms (AWS/GCP/Azure)
- Familiarity with microservices architecture
- Experience with Celery, RabbitMQ, Kafka
- Knowledge of testing frameworks (Pytest, unittest)
- Exposure to e-commerce platforms or high-traffic scalable systems
Who We Are
Orbo is a research-oriented company with expertise in computer vision and artificial intelligence. At its core, Orbo is a comprehensive platform built on an AI-based visual-enhancement stack, so companies can find a product suited to their needs, with deep-learning-powered technology that automatically improves their imagery.
ORBO's solutions are helping the BFSI, beauty and personal care, and e-commerce image-retouching industries with digital transformation in multiple ways.
WHY US
- Join top AI company
- Grow with your best companions
- Continuous pursuit of excellence, equality, respect
- Competitive compensation and benefits
You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.
To learn more about how we work, please check out
Description:
We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This is a fast-paced role, and the person will get the opportunity to develop an industrial-grade solution from concept to deployment.
Responsibilities:
- Research and develop computer vision solutions for industries (BFSI, beauty and personal care, e-commerce, defence, etc.)
- Lead a team of ML engineers in developing an industrial AI product from scratch
- Set up an end-to-end deep-learning pipeline for data ingestion, preparation, model training, validation, and deployment
- Tune the models to achieve high accuracy rates and minimal latency
- Deploy developed computer vision models on edge devices, after optimization, to meet customer requirements
Requirements:
- Bachelor’s degree
- Deep and broad understanding of computer vision and deep learning algorithms
- 4+ years of industrial experience in computer vision and/or deep learning
- Experience in taking an AI product from scratch to commercial deployment.
- Experience in Image enhancement, object detection, image segmentation, image classification algorithms
- Experience in deployment with OpenVINO, ONNX Runtime, and TensorRT
- Experience in deploying computer vision solutions on edge devices such as Intel Movidius and Nvidia Jetson
- Experience with machine/deep learning frameworks such as TensorFlow and PyTorch
- Proficient understanding of code versioning tools, such as Git
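A standard building block of the object-detection work listed above is intersection-over-union (IoU) between predicted and ground-truth boxes. The sketch below is a minimal, dependency-free illustration (the `(x1, y1, x2, y2)` corner format and the example boxes are assumptions for demonstration, not tied to any particular framework):

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes in (x1, y1, x2, y2) form."""
    # Corners of the intersection rectangle (may be empty).
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two 10x10 boxes overlapping in a 5x5 region: IoU = 25 / 175.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))
```

Detection pipelines use this same quantity both for matching predictions to ground truth when computing mAP and for suppressing duplicates in non-maximum suppression.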
Our perfect candidate is someone who:
- is proactive and an independent problem solver
- is a constant learner. We are a fast-growing start-up. We want you to grow with us!
- is a team player and good communicator
What We Offer:
- You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
- You will be in charge of what you build and be an integral part of the product development process
- Technical and financial growth!
Job Title: Kafka Architect
Experience Range: 8+ years
Location: Pune
Required Experience:
Kafka Architect with experience in Kafka Connect, Kafka Streams, the broader Kafka ecosystem, and any scripting language
Kafka Brokers, Kafka Connect, Schema Registry, and ZooKeeper (or KRaft)
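Much of what a Kafka architect reasons about follows from keyed partitioning: all messages with the same key land on the same partition, which preserves per-key ordering. As a simplified sketch only (Kafka's actual default partitioner uses murmur2 over the key bytes; CRC32 is substituted here purely to keep the illustration deterministic and standard-library-only):

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Map a message key to a partition deterministically, so every
    message with the same key goes to the same partition."""
    return zlib.crc32(key) % num_partitions

# All events for the same order ID hash to one partition, so a
# consumer of that partition sees them in production order.
for key in [b"order-1001", b"order-1002", b"order-1001"]:
    print(key.decode(), "->", partition_for(key, 6))
```

This is also why changing the partition count of an existing topic breaks key-to-partition affinity: the modulus changes, so existing keys remap.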
Experience: 5 - 13 Years
Location: Chennai
Notice Period: 30 Days
Required skills:
- Good knowledge of PHP, Linux, and databases.
- Good knowledge of Laravel, CodeIgniter, or any other framework.
- Good knowledge of JavaScript.
- Candidates must have good practical experience with PHP, OOP, and SQL concepts.
Selected candidate's day-to-day responsibilities include:
- Speaking to parents (leads given by us) and:
  - Convincing them to book a consultation (paid or free, as applicable)
  - Upselling/converting to paid subscriptions/courses and programs, wherever applicable
- Communicating with prospects via email and WhatsApp
- Creating and maintaining MIS and daily reports
- Maximising automation within the organization.
- Understand various functions’ systems & automation-related needs and design & implement solutions for them.
- Manage Learning Management System (LMS) from the technical side including coordination with the LMS provider for white-label LMS development.
- Provide necessary training to MML employees & students for optimal utilisation of automation & technology solutions.
- Add value to the growth of the organization's top line & bottom line.
Performance Testing
Requires C#/Java and basic coding knowledge.
At Regions, the Performance Engineer conducts a wide range of quality-control tests and analysis to ensure that software meets or exceeds specified standards and end-user requirements, and monitors the performance of production systems.
Primary Responsibilities
- Collaborates with other engineers to develop testing and monitoring approaches
- Focuses on assurance in the areas of error rates, response times, and impact to infrastructure
- Designs, scripts, configures, and runs performance tests to validate the production readiness, stability, and performance of software applications and infrastructure
- Supports the testing needs of new projects, infrastructure upgrades, and application enhancements by validating application functionality against documented test cases
- Uses dashboards that enable efficient monitoring of applications and systems running in production environments to troubleshoot issues and report findings
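Response-time analysis of the kind described above is usually reported as latency percentiles (p95, p99) rather than averages, because a few slow requests dominate user experience. A minimal sketch with hypothetical sample data, using the nearest-rank method (one of several percentile definitions; load-testing tools may interpolate instead):

```python
def percentile(latencies_ms, p):
    """Nearest-rank percentile: the smallest sample value such that
    at least p% of the samples are at or below it."""
    ordered = sorted(latencies_ms)
    rank = -(-len(ordered) * p // 100)  # ceil(n * p / 100)
    return ordered[max(rank, 1) - 1]

# Hypothetical per-request latencies from a test run, in ms.
samples = [120, 85, 95, 340, 100, 110, 90, 105, 98, 102]
print("p50:", percentile(samples, 50), "ms")
print("p95:", percentile(samples, 95), "ms")
```

Note how the single 340 ms outlier barely moves the median but defines the p95, which is exactly why percentile thresholds appear in performance test pass/fail criteria.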
1. Developing high-quality content, with a unique, SEO-optimized article for each given topic
2. Writing, reviewing, and publishing web content; proactively updating and handling website landing pages and blogs
3. Candidates must be familiar with government exams
4. Adding relevant images/screenshots to the article
