
We at Condé Nast are looking for a Level 2 Support Engineer who will be responsible for
monitoring and maintaining production systems to ensure business continuity. Your
responsibilities will also include prompt communication to business and internal teams
about process delays, stability issues, and resolutions.
Primary Responsibilities
● 5+ years of experience in production support.
● The Support Data Engineer is responsible for monitoring the data pipelines
that are in production.
● Level 3 support activities: analysing issues, debugging programs and jobs, and fixing bugs.
● The position will contribute to monitoring, rerunning or rescheduling, and code fixes
of pipelines for a variety of projects on a daily basis.
● Escalate failures to the Data team/DevOps in case of infrastructure failures or when
the data pipelines cannot be revived.
● Ensure accurate alerts are raised in case of pipeline failures and that the corresponding
stakeholders (business/data teams) are notified within the
agreed-upon SLAs.
● Prepare and present success/failure metrics by accurately logging the
monitoring stats.
● Able to work in shifts to provide overlap with US business teams.
● Other duties as requested or assigned.
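The alerting responsibility above can be sketched in a few lines. Below is a minimal illustration in Python; the pipeline names, statuses, and the SLA window are all invented for the example:

```python
from datetime import datetime, timedelta

# Hypothetical SLA: a failed pipeline must be escalated if it has not
# recovered within this window.
SLA_WINDOW = timedelta(hours=2)

def runs_to_alert(runs, now):
    """Return names of pipelines whose latest failure has breached the SLA.

    `runs` is a list of dicts like:
      {"pipeline": "daily_ingest", "status": "failed", "failed_at": datetime(...)}
    """
    breached = []
    for run in runs:
        if run["status"] == "failed" and now - run["failed_at"] > SLA_WINDOW:
            breached.append(run["pipeline"])
    return breached

now = datetime(2024, 1, 1, 12, 0)
runs = [
    {"pipeline": "daily_ingest", "status": "failed",
     "failed_at": datetime(2024, 1, 1, 9, 0)},   # 3h ago: breached
    {"pipeline": "hourly_sync", "status": "failed",
     "failed_at": datetime(2024, 1, 1, 11, 30)}, # 30m ago: within SLA
    {"pipeline": "weekly_rollup", "status": "succeeded",
     "failed_at": None},
]
print(runs_to_alert(runs, now))  # ['daily_ingest']
```

In practice the run records would come from the scheduler's API and the alert would go to a paging or chat system rather than `print`.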
Desired Skills & Qualifications
● Strong working knowledge of PySpark, Informatica, SQL (Presto), batch
handling through schedulers (Databricks and Astronomer will be an
advantage), AWS S3, Airflow, and Hive/Presto.
● Basic knowledge of shell scripts and/or Bash commands.
● Able to execute queries in databases and produce outputs.
● Able to understand and execute the steps provided by the Data team to
revive data pipelines.
● Strong verbal and written communication skills, and strong interpersonal
skills.
● Graduate/Diploma in computer science or information technology.
About Condé Nast
CONDÉ NAST GLOBAL
Condé Nast is a global media house with over a century of distinguished publishing
history. With a portfolio of iconic brands like Vogue, GQ, Vanity Fair, The New Yorker and
Bon Appétit, we at Condé Nast aim to tell powerful, compelling stories of communities,
culture and the contemporary world. Our operations are headquartered in New York and
London, with colleagues and collaborators in 32 markets across the world, including
France, Germany, India, China, Japan, Spain, Italy, Russia, Mexico, and Latin America.
Condé Nast has been raising industry standards and setting records for excellence in
the publishing space. Today, our brands reach over 1 billion people in print, online, video,
and social media.
CONDÉ NAST INDIA (DATA)
Over the years, Condé Nast has successfully expanded and diversified into digital, TV, and
social platforms, generating a staggering amount of user data in the process. Condé Nast
made the right move to invest heavily in understanding this data and formed a whole new
Data team entirely dedicated to data processing, engineering, analytics, and visualization. This
team helps drive engagement, fuel process innovation, further content enrichment, and
increase market revenue. The Data team aims to create a company culture where data
is the common language and to facilitate an environment where insights shared in
real time can improve performance. The Global Data team operates out of Los Angeles,
New York, Chennai, and London. The team at Condé Nast Chennai works extensively with
data to amplify its brands' digital capabilities and boost online revenue. We are broadly
divided into four groups: Data Intelligence, Data Engineering, Data Science, and
Operations (including Product and Marketing Ops, and Client Services), along with Data
Strategy and Monetization. The teams build capabilities and products to create
data-driven solutions for better audience engagement.
What we look forward to:
We want to welcome bright, new minds into our midst and work together to create
diverse forms of self-expression. At Condé Nast, we encourage the imaginative and
celebrate the extraordinary. We are a media company for the future, with a remarkable
past. We are Condé Nast, and It Starts Here.

At Technoidentity, we’re a Data + AI product engineering company delivering scalable, modern enterprise solutions.
We are seeking a seasoned Senior Database Developer & PostgreSQL Expert with 8–10 years of experience across multiple database systems, with deep hands-on expertise in PostgreSQL internals, performance tuning, advanced indexing, and enterprise data architecture.
This role, with a strong preference for Oracle-to-AlloyDB (PostgreSQL) migration expertise, blends modernization, advanced PostgreSQL engineering, and cloud-native architecture on Google Cloud Platform (GCP). If you thrive in transforming complex data logic, optimizing performance, and architecting enterprise-grade PostgreSQL systems, this opportunity is built for you.
Requirements
Key Responsibilities
Database Migration & Development
- Lead database migrations from Oracle to PostgreSQL-based platforms like AlloyDB, applying both manual and automated strategies.
- Re-engineer stored procedures, triggers, and packages from PL/SQL to PL/pgSQL.
- Recreate and optimize DB objects (schemas, constraints, views, indexes) across target platforms.
- Design PostgreSQL-specific architecture including partitioning strategies, indexing plans (GIN, GiST, BRIN), and query optimization paths.
- Leverage PostgreSQL internals such as vacuum tuning, WAL configuration, and connection management for optimal system performance.
Performance Optimization
- Implement indexing strategies, query plans, and table partitioning to enhance database performance.
- Use tools like EXPLAIN ANALYZE, pg_stat_statements, and GCP-native monitoring dashboards for tuning.
- Collaborate with DevOps for deployment pipelines and infrastructure-as-code (IaC) best practices.
- Perform deep-dive PostgreSQL performance fine-tuning including buffer cache analysis, autovacuum configuration, and planner optimization.
- Optimize PostgreSQL workloads for high concurrency and large datasets, ensuring stable and predictable performance.
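The "inspect the plan, add an index, re-inspect" loop behind the bullets above looks roughly like this. A self-contained sketch using Python's stdlib sqlite3 driver (in PostgreSQL, EXPLAIN ANALYZE plays the role that EXPLAIN QUERY PLAN plays here; the table and data are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = 42"

# Before indexing: the planner falls back to a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before)  # the detail column mentions a SCAN of orders

# Add an index on the filter column, as an indexing strategy would suggest.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# After indexing: the plan switches to an index search.
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after)  # the detail column now names idx_orders_customer
```

The same discipline carries over to PostgreSQL, where pg_stat_statements identifies the queries worth putting through this loop.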
Data Integrity & Validation
- Build and execute validation scripts for source-to-target comparison.
- Ensure row-level data accuracy, transformation logic fidelity, and zero data loss post-migration.
- Manage large-scale datasets for archival and bulk processing tasks.
- Implement PostgreSQL-native data integrity strategies like constraint management, row-level security (RLS), and trigger-based validations.
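A source-to-target validation script of the kind described above reduces to diffing keyed row sets from both sides. A minimal pure-Python sketch, with illustrative rows:

```python
def diff_rows(source_rows, target_rows, key):
    """Compare two row sets by primary key.

    Rows are dicts; `key` names the primary-key column. Returns keys
    missing from the target and keys whose values changed in flight.
    """
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(k for k in src if k not in tgt)
    mismatched = sorted(k for k in src if k in tgt and src[k] != tgt[k])
    return missing, mismatched

source = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": 250.0},
    {"id": 3, "amount": 75.5},
]
target = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": 999.0},  # transformed incorrectly in migration
]
print(diff_rows(source, target, "id"))  # ([3], [2])
```

A production version would stream rows from both databases in key order rather than hold them in memory, but the comparison logic is the same.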
Collaboration & Documentation
- Work with cross-functional teams to translate business logic and legacy workflows to scalable, modern database systems.
- Maintain clear documentation on schema conversions, validation methods, rollback strategies, and change history.
- Create PostgreSQL architecture documents, optimization playbooks, and database best-practice guidelines for engineering teams.
Ideal Candidate Profile
Experience & Skills
- 4–10 years of database development experience across Oracle, PostgreSQL, and similar RDBMS platforms.
- Strong expertise in PL/SQL and PL/pgSQL for procedural logic, error handling, and performance tuning.
- Proven track record of large-scale database migration projects (Oracle to PostgreSQL/AlloyDB preferred).
- Proficient in query optimization, indexing, partitioning, and schema normalization.
🌍 We’re Hiring: Senior Field AI Engineer | Remote | Full-time
Are you passionate about pioneering enterprise AI solutions and shaping the future of agentic AI?
Do you thrive in strategic technical leadership roles where you bridge advanced AI engineering with enterprise business impact?
We’re looking for a Senior Field AI Engineer to serve as the technical architect and trusted advisor for enterprise AI initiatives. You’ll translate ambitious business visions into production-ready applied AI systems, implementing agentic AI solutions for large enterprises.
What You’ll Do:
🔹 Design and deliver custom agentic AI solutions for mid-to-large enterprises
🔹 Build and integrate intelligent agent systems using frameworks like LangChain, LangGraph, CrewAI
🔹 Develop advanced RAG pipelines and production-grade LLM solutions
🔹 Serve as the primary technical expert for enterprise accounts and build long-term customer relationships
🔹 Collaborate with Solutions Architects, Engineering, and Product teams to drive innovation
🔹 Represent technical capabilities at industry conferences and client reviews
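At their core, the RAG pipelines mentioned above retrieve the documents most relevant to a query before prompting an LLM. A minimal sketch of the retrieval step in Python, with toy hand-written vectors standing in for real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, top_k=2):
    """Rank documents by cosine similarity to the query embedding."""
    ranked = sorted(corpus, key=lambda doc: cosine(query_vec, doc["vec"]),
                    reverse=True)
    return [doc["text"] for doc in ranked[:top_k]]

# Toy 3-d "embeddings"; a real pipeline would get these from an
# embedding model and store them in a vector database.
corpus = [
    {"text": "refund policy", "vec": [0.9, 0.1, 0.0]},
    {"text": "shipping times", "vec": [0.1, 0.9, 0.0]},
    {"text": "return window", "vec": [0.8, 0.2, 0.1]},
]
query_vec = [1.0, 0.0, 0.0]
context = retrieve(query_vec, corpus)
print(context)  # ['refund policy', 'return window']
```

The retrieved texts would then be stitched into the LLM prompt as grounding context; the frameworks named above wrap this loop with chunking, re-ranking, and agent orchestration.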
What We’re Looking For:
✔️ 7+ years of experience in AI/ML engineering with production deployment expertise
✔️ Deep expertise in agentic AI frameworks and multi-agent system design
✔️ Advanced Python programming and scalable backend service development
✔️ Hands-on experience with LLM platforms (GPT, Gemini, Claude) and prompt engineering
✔️ Experience with vector databases (Pinecone, Weaviate, FAISS) and modern ML infrastructure
✔️ Cloud platform expertise (AWS, Azure, GCP) and MLOps/CI-CD knowledge
✔️ Strategic thinker able to balance technical vision with hands-on delivery in fast-paced environments
✨ Why Join Us:
- Drive enterprise AI transformation for global clients
- Work with a category-defining AI platform bridging agents and experts
- High-impact, customer-facing role with strategic influence
- Competitive benefits: medical, vision, dental insurance, 401(k)
Responsibilities of a Senior Software Engineer (Backend):
- You will guide the entire application lifecycle, including research, design, development, testing (dev owns quality), and continuous deployment and delivery
- Lead design with a major focus on the best user experience, performance, scalability, and future expansion
- You will act as a mentor for less-experienced peers, using both your technical knowledge and leadership skills to bring in continuous improvements and implement best practices.
- You will apply the latest technology thinking from our tech radar and adopt best design practices to solve complex problems and ensure our product is the best in usability.
- You will work directly with the CEO and the cross-functional product teams to align on the needs of the products and operations.
- You will analyze business and technology challenges and suggest solutions
- Build an "Awesome" team
This opportunity is for you if,
- You have 4-6 years of work experience in building highly interactive applications using Java, Spring MVC, Spring JMS, Spring JDBC, Spring IOC, Spring Boot, MySQL, MyBatis, ReactJS, JavaScript, jQuery, AWS, and JMS, and have knowledge of a wide range of web technologies, tools, and frameworks.
- You have experience in working with RESTful web services
- You have strong knowledge and understanding of design patterns and domain-driven design
- You understand the fundamental design principles behind building scalable, resilient, and maintainable applications
- You have functional knowledge of inclusive design: accessibility and related tooling
- You have consumer web development experience for high-traffic, public-facing web applications
- You have a knack for writing clean, readable, reusable code
- You have good knowledge of TDD and CI/CD practices.
- You have experience in managing and publishing releases using Git branching and tools like npm and Yarn.
- You have a penchant for learning
- You are a great analytical and logical thinker and someone who loves solving problems
- You are passionate, energetic, enthusiastic, and a go-getter
- You are "FUN @ Work"
Excellent C and C++ design skills, with multithreading experience
Experience in network programming with sockets, TCP/IP protocols (TCP, UDP, DNS, DHCP, etc.) and application-layer protocols like SNMP, HTTP/S, FTP, IPP, LPR, and WSD
Ability to study and analyze network packets
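For flavour, the socket-level work above maps closely onto Python's stdlib socket module (the BSD socket API in C mirrors these calls almost one-to-one). A minimal TCP echo exchange over loopback:

```python
import socket
import threading

def echo_server(server_sock):
    """Accept one connection and echo its payload back unchanged."""
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)

# Bind to an ephemeral loopback port so the sketch is self-contained.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"ping")
    reply = client.recv(1024)
print(reply)  # b'ping'
```

The C equivalent walks through the same socket/bind/listen/accept and connect/send/recv sequence, with the protocol-level framing left to the application.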
Duties and Responsibilities:
Research and develop innovative use cases, solutions, and quantitative models in video
and image recognition and signal processing for cloudbloom's cross-industry business
(e.g., Retail, Energy, Industry, Mobility, Smart Life and Entertainment).
Design, Implement and Demonstrate Proof-of-Concept and Working Proto-types
Provide R&D support to productize research prototypes.
Explore emerging tools, techniques, and technologies, and work with academia for cutting-edge solutions.
Collaborate with cross-functional teams and eco-system partners for mutual business benefit.
Team Management Skills
Academic Qualification
7+ years of professional hands-on work experience in data science, statistical modelling,
data engineering, and predictive analytics assignments
Mandatory requirements: Bachelor's degree with a STEM background (Science, Technology,
Engineering and Mathematics) and a strong quantitative flavour
Innovative and creative in data analysis, problem solving and presentation of solutions.
Ability to establish effective cross-functional partnerships and relationships at all levels in a
highly collaborative environment
Strong experience in handling multi-national client engagements
Good verbal, writing & presentation skills
Core Expertise
Excellent understanding of the basics of mathematics and statistics (such as differential
equations, linear algebra, matrices, combinatorics, probability, Bayesian statistics,
eigenvectors, Markov models, and Fourier analysis).
Building data analytics models using Python, ML libraries, and Jupyter/Anaconda, plus
knowledge of database query languages like SQL
Good knowledge of machine learning methods like k-Nearest Neighbors, Naive Bayes, SVM,
and Decision Forests.
Strong Math Skills (Multivariable Calculus and Linear Algebra) - understanding the
fundamentals of Multivariable Calculus and Linear Algebra is important as they form the basis
of a lot of predictive performance or algorithm optimization techniques.
Deep learning: CNNs, neural networks, RNNs, TensorFlow, PyTorch, computer vision
Large-scale data extraction/mining, data cleansing, diagnostics, and preparation for modeling
Good applied statistical skills, including knowledge of statistical tests, distributions,
regression, and maximum likelihood estimators
Multivariate techniques & predictive modeling: cluster analysis, discriminant analysis,
CHAID, logistic & multiple regression analysis
Experience with Data Visualization Tools like Tableau, Power BI, Qlik Sense that help to
visually encode data
Excellent Communication Skills – it is incredibly important to describe findings to a technical
and non-technical audience
Capability for continuous learning and knowledge acquisition.
Mentor colleagues for growth and success
Strong Software Engineering Background
Hands-on experience with data science tools
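Of the classical methods listed above, k-Nearest Neighbors is the easiest to sketch from scratch. A pure-Python illustration with toy data and Euclidean distance:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest training points.

    `train` is a list of (features, label) pairs.
    """
    by_distance = sorted(train, key=lambda item: math.dist(item[0], query))
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Two well-separated toy clusters.
train = [
    ((1.0, 1.0), "a"), ((1.2, 0.8), "a"), ((0.9, 1.1), "a"),
    ((5.0, 5.0), "b"), ((5.2, 4.8), "b"), ((4.9, 5.1), "b"),
]
print(knn_predict(train, (1.1, 1.0)))  # a
print(knn_predict(train, (5.1, 5.0)))  # b
```

In practice one would reach for a library implementation with vectorized distance computations, but the vote-among-neighbors logic is exactly this.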
- Developing both the frontend and backend of applications using technologies like Java, Angular, JavaScript, and HTML5.
- Designing user interactions on web pages.
- Designing and developing back-end APIs and databases.
- Meeting both technical and customer needs.
- Seeing a project through from conception to finished product.
- Taking complete ownership of the project and delivering with good quality.
- Staying abreast of developments in web applications and programming languages.
Requirements:
- 8+ years of relevant experience
- Strong communication and organizational skills. Good in Agile, Scrum, and DevOps practices.
- Proficiency with fundamental front-end languages such as HTML5, CSS, and JavaScript.
- Familiarity with JavaScript frameworks such as Angular, React, and Ember.
- Proficiency in Java, Spring Boot, and microservices
- Familiarity with database technologies such as PostgreSQL, MySQL, Oracle, and MongoDB.
- Good at writing quality code and unit tests. Good knowledge of DevOps practices.
- Excellent verbal communication skills.
- Good problem-solving skills
● You’ve been building the backend for web applications.
● You have experience with any of these backend programming languages:
Python, Node.js, or Java.
● You write understandable, testable code with an eye towards
maintainability.
● You are a strong communicator. Explaining complex technical concepts to
designers, support, and other engineers is no problem for you.
● You possess strong computer science fundamentals: data structures,
algorithms, programming languages, distributed systems, and information
retrieval.
● You have completed a bachelor's degree in Computer Science, Engineering, or a
related field, or equivalent training, fellowship, or work experience.








