


Role Proficiency:
This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept with ETL tools such as Informatica, Glue, Databricks, and DataProc, and have strong coding skills in Python, PySpark, and SQL. The position demands the ability to work independently across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.
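By way of illustration, here is a minimal PySpark sketch of the ingest/wrangle/transform/join flow this role describes; the paths, column names, and output table are hypothetical, and the Delta write assumes a Databricks-style environment:

```python
# Minimal PySpark sketch: ingest, wrangle, and join two sources.
# Paths, column names, and the output table are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

# Ingest: raw orders (JSON) and customer reference data (CSV).
orders = spark.read.json("s3://example-bucket/raw/orders/")
customers = spark.read.option("header", True).csv("s3://example-bucket/ref/customers.csv")

# Wrangle/transform: drop malformed rows, normalize types.
orders_clean = (
    orders
    .where(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
)

# Join and write to a Delta table (assumes a Delta-capable runtime, e.g. Databricks).
enriched = orders_clean.join(customers, on="customer_id", how="left")
enriched.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_enriched")
```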
Skill Examples:
- Proficiency in SQL, Python, or other programming languages used for data manipulation.
- Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
- Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery).
- Ability to test data pipelines and evaluate results against data quality and performance specifications.
- Experience in performance tuning.
- Experience in data warehouse design and cost improvements.
- Apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
- Communicate and explain design/development aspects to customers.
- Estimate time and resource requirements for developing/debugging features/components.
- Participate in RFP responses and solutioning.
- Mentor team members and guide them in relevant upskilling and certification.
Knowledge Examples:
- Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLF.
- Proficient in SQL for analytics and windowing functions (a PySpark windowing sketch follows this list).
- Understanding of data schemas and models.
- Familiarity with domain-related data.
- Knowledge of data warehouse optimization techniques.
- Understanding of data security concepts.
- Awareness of patterns, frameworks, and automation practices.
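To make the SQL analytics and windowing item concrete, here is a minimal sketch using PySpark's window API; the sales table and its columns are hypothetical examples:

```python
# Minimal sketch of analytic/windowing queries in PySpark.
# The "sales" table and its columns are hypothetical examples.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("window-sketch").getOrCreate()
sales = spark.table("sales")  # columns assumed: region, order_ts, amount

# Rank each order within its region by amount, and keep a running total over time.
by_region = Window.partitionBy("region").orderBy(F.col("amount").desc())
running = (
    Window.partitionBy("region")
    .orderBy("order_ts")
    .rowsBetween(Window.unboundedPreceding, 0)
)

result = (
    sales
    .withColumn("rank_in_region", F.rank().over(by_region))
    .withColumn("running_total", F.sum("amount").over(running))
)
result.show()
```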
Additional Comments:
# of Resources: 22
Role(s): Technical Role
Location(s): India
Planned Start Date: 1/1/2026
Planned End Date: 6/30/2026
Project Overview:
Role Scope / Deliverables: We are seeking a highly skilled Data Engineer with strong experience in Databricks, PySpark, Python, SQL, and AWS to join our data engineering team on or before the first week of December 2025.
The candidate will be responsible for designing, developing, and optimizing large-scale data pipelines and analytics solutions that drive business insights and operational efficiency.
- Design, build, and maintain scalable data pipelines using Databricks and PySpark.
- Develop and optimize complex SQL queries for data extraction, transformation, and analysis.
- Implement data integration solutions across multiple AWS services (S3, Glue, Lambda, Redshift, EMR, etc.); a sketch of this kind of S3-to-Glue integration follows this list.
- Collaborate with analytics, data science, and business teams to deliver clean, reliable, and timely datasets.
- Ensure data quality, performance, and reliability across data workflows.
- Participate in code reviews, data architecture discussions, and performance optimization initiatives.
- Support migration and modernization of legacy data systems to modern cloud-based solutions.
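For illustration, a minimal boto3 sketch of the S3-to-Glue integration mentioned above; the bucket, prefix, and Glue job name are assumptions, not project specifics:

```python
# Minimal boto3 sketch: verify new data landed in S3, then start a Glue job.
# The bucket, prefix, and Glue job name are illustrative assumptions.
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

resp = s3.list_objects_v2(Bucket="example-data-lake", Prefix="raw/orders/2025-12-01/")
if resp.get("KeyCount", 0) > 0:
    run = glue.start_job_run(
        JobName="orders-etl",
        Arguments={"--ingest_date": "2025-12-01"},
    )
    print("Started Glue run:", run["JobRunId"])
else:
    print("No input files yet; skipping run.")
```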
Key Skills:
Hands-on experience with Databricks, PySpark & Python for building ETL/ELT pipelines.
Proficiency in SQL (performance tuning, complex joins, CTEs, window functions).
Strong understanding of AWS services (S3, Glue, Lambda, Redshift, CloudWatch, etc.).
Experience with data modeling, schema design, and performance optimization.
Familiarity with CI/CD pipelines, version control (Git), and workflow orchestration (Airflow preferred); see the orchestration sketch after this list.
Excellent problem-solving, communication, and collaboration skills.
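As an illustration of the orchestration item above, a minimal Airflow DAG sketch (assuming Airflow 2.4+); the DAG id, schedule, and the run_pipeline callable are hypothetical:

```python
# Minimal Airflow sketch: a daily DAG that runs a PySpark ETL step.
# The DAG id, schedule, and the run_pipeline callable are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_pipeline(ds: str) -> None:
    # Placeholder for the real PySpark/Databricks job submission.
    print(f"Running ETL for partition date {ds}")


with DAG(
    dag_id="orders_daily_etl",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",  # "schedule" requires Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    etl = PythonOperator(
        task_id="run_etl",
        python_callable=run_pipeline,
        op_kwargs={"ds": "{{ ds }}"},  # templated at runtime by Airflow
    )
```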
Skills: Databricks, PySpark & Python, SQL, AWS Services
Must-Haves
Python/PySpark (5+ years), SQL (5+ years), Databricks (3+ years), AWS Services (3+ years), ETL tools (Informatica, Glue, DataProc) (3+ years)
******
Notice period - Immediate to 15 days
Location: Bangalore
- Strong Enterprise Account Executive / Enterprise Sales Executive profile with a focus on global markets
- Mandatory (Experience 1): Must have 3+ years of experience in B2B Tech / SaaS sales.
- Mandatory (Experience 2): Must have had annual sales quotas of $250K+ with average deal sizes of $20K+
- Mandatory (Experience 3): Must have experience working in global markets (APAC / US / UK or similar), India & Middle East–only experience is not acceptable.
- Mandatory (Compensation): Candidate's Total CTC should be above 30L (Fixed+Variable)
- Mandatory (Company): Must be from a B2B SaaS Product Company
******
We’re looking for a Senior PPC Specialist with at least 5 years of experience managing paid campaigns across Google, Meta, and other digital platforms. You’ll be responsible for building, optimising, and scaling paid media campaigns that deliver measurable results for our clients.
What You’ll Do:
• Plan, create, and manage PPC campaigns on Google Ads, Facebook, and Instagram.
• Handle ad budgets of $10,000+ per month across search, display, and video campaigns.
• Develop strategies to improve ROI, conversion rates, and overall campaign efficiency.
• Create and test compelling ad copy, visuals, and targeting to maximise results.
• Build retargeting and eCommerce campaigns that drive real conversions.
• Analyse data in GA4 and prepare reports using Looker Studio.
• Regularly A/B test landing pages, audiences, and creative to find what works best.
• Collaborate with cross-functional teams to align ad performance with business goals.
• Prepare campaign proposals and pricing strategies for new clients.
What You Bring:
• 5+ years of hands-on experience in PPC campaign management.
• Strong understanding of Google Ads, Meta Ads, and other paid media platforms.
• Experience managing campaigns for international clients (US market experience is a plus).
• Solid grasp of data analysis and ability to turn insights into actions.
• Excellent communication skills — written and verbal.
• A proactive mindset: you enjoy testing, learning, and improving every day.
• Bonus: Copywriting skills and an eye for creativity that converts.
Why E2M:
• A collaborative, no-drama work culture that values initiative.
• Flexible and remote-friendly work environment.
• Opportunities to grow, experiment, and make an impact.
******
- Minimum 1 year of experience required.
- Experience with engine versions: Unreal Engine 4.27, 5, and above
- Expertise in both Blueprints and C++
- Ability to build scalable and modular gameplay systems
- Good understanding of core gameplay architecture
- Experience working with UMG (Unreal Motion Graphics)
- Ability to optimize performance for PC and console platforms
- Familiar with game loop architecture, tick handling, and event-driven programming
- Capable of writing clean, maintainable, and modular code
- Experienced in debugging, profiling, and performance analysis
- Familiar with code versioning tools like Git, Perforce, SVN, or Plastic SCM
- Ability to integrate and test third-party plugins or systems
- Passionate about learning and implementing new Unreal Engine features
- Able to work independently as well as in cross-functional teams
- Strong communication skills
- Basic understanding of shader graphs and material system (preferred)
- Eager to contribute to both game and non-game creative solutions
- Experience with simulation projects like training tools, infrastructure planning, or visualization systems
- Familiar with animation systems (characters and vehicles)
- Familiarity with AI systems like Behavior Trees and NavMesh
- Understanding of replication and multiplayer mechanics
- Experience with Niagara or Cascade particle systems
******
We are looking for a highly motivated senior developer with at least 3 years of strong hands-on experience in Java to join our startup. You will play a pivotal role in shaping the initial tech stack and will be responsible for designing and implementing product requirements that are highly usable, scalable, extensible, and maintainable. You should be comfortable working across the technologies and frameworks we use: Microservices, Java, Spring, Spring Boot, MySQL, Kubernetes, AWS.
Responsibilities and Duties:
- Design and build scalable REST APIs on Spring Boot
- Develop, test, tune for performance and deploy microservices
- Collaborate with the team, optimize and refactor the back-end architecture
- Maintain high standards of quality for code, documentation and other deliverables
- Coordinate actively across teams
- Participate in architecture and tech-stack discussions to optimize for increasing server load
- Handle DevOps tasks and explore AWS features
What are we looking for?
- 3+ years of strong experience in Core Java & backend technologies
- Good working knowledge of design patterns & OOAD
- Excellent analytical and problem-solving skills
- Skills we consider: Java, MySQL/RDS, Spring/Play, Maven, Redis, Kafka/SQS, Elasticsearch, AWS
- Experience with designing, implementing and deploying microservices
- Prior experience working in a startup
******
Responsibilities:
- Help design and implement functional requirements
- Build efficient back-end features in Python
- Integrate front-end components into applications
- Manage testing and bug fixes
- Prepare technical documentation
- Collaborate with UX/UI designers to implement design into the code
- Participate in code reviews
- Implement software enhancements and suggest improvements
What we are looking for:
- Solid experience as Python Developer
- Experience with Python frameworks (e.g. Django, Flask, Bottle)
- Familiarity with Amazon Web Services (AWS) and REST APIs
- Understanding of databases and SQL
- Attention to detail
- Leadership skills
- Self-starter, able to work independently
Bonus skills:
- Cloud deployment services: Docker, Kubernetes, AWS/Azure/OpenShift/GCS, etc.
- API deployment / WSGI or ASGI frameworks: Flask, Django, Bottle, FastAPI, etc.
- Basic database operations with Python (CRUD); see the sketch after this list.
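For illustration, a minimal Flask sketch of a REST endpoint with basic CRUD operations as mentioned in the bonus skills; the in-memory store and routes are hypothetical:

```python
# Minimal Flask sketch: a REST endpoint with basic create/read (CRUD) operations.
# The in-memory store and route names are hypothetical examples.
from flask import Flask, jsonify, request

app = Flask(__name__)
items: dict[int, dict] = {}  # stand-in for a real database table


@app.post("/items")
def create_item():
    item = request.get_json()
    item_id = len(items) + 1
    items[item_id] = item
    return jsonify({"id": item_id, **item}), 201


@app.get("/items/<int:item_id>")
def read_item(item_id: int):
    if item_id not in items:
        return jsonify({"error": "not found"}), 404
    return jsonify(items[item_id])


if __name__ == "__main__":
    app.run(debug=True)
```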
******
Greetings!
We are looking for an Angular Developer for the Goregaon (Mumbai) / Lucknow location.
Experience: 3 - 5 Years
Notice Period: up to 30 days
Location: Goregaon (Mumbai)/ Lucknow
Responsibilities and Deliverables:
- Ability to translate UI designs into fully functional interactive prototypes
- Complete familiarity with web technologies: HTML5, CSS3, JavaScript, jQuery, and AJAX
- Should be able to design a responsive website
- Should have knowledge of working with modularized structures
- Thorough knowledge of JS frameworks and of OOP & MVC
- Should be able to design the frontend application and integrate APIs to display data in the frontend
- Design the frontend using HTML and CSS or Bootstrap, and make webpages responsive for mobile as well
- Check cross-browser compatibility across different browsers; integrate APIs in Angular to display data in the frontend per the logic provided
- Strong skills in a framework such as Angular, React.js, or Ionic
- Data visualizations using Canvas or SVG, Highcharts, D3.js, etc.
- Good understanding of RWD, SPAs, and hybrid app development
- Knowledge of XML, JSON, REST APIs, and web services is essential
- Experience working with CSS preprocessors, Bootstrap, and Material Design
- Familiarity with test automation and deployment automation tools
Please provide the following details:
- Current Location:
- Present CTC:
- Expected CTC:
- Notice Period:








