
1. Good communication skills
2. Only female candidates are required
3. Minimum of 1 year of telecalling experience
4. Experience in the education industry is an added advantage

About Reqroots
About the Role:
We are seeking a skilled Data Engineer to join our growing AdTech team. In this role, you will design, build, and maintain high-performance ETL pipelines and large-scale data processing systems. You will work with massive datasets and distributed frameworks to power Adsremedy’s data-driven advertising solutions across Programmatic, In-App, CTV, and DOOH platforms.
What You’ll Do:
- Design, develop, and maintain scalable ETL pipelines on self-managed infrastructure
- Process and optimize large-scale datasets (terabytes of data) with high reliability and performance
- Build robust data processing workflows using Apache Spark (preferred) and/or Apache Flink
- Integrate, clean, and transform data from multiple internal and external sources
- Partner closely with data scientists, analysts, and business stakeholders to enable actionable insights
- Monitor, troubleshoot, and optimize data pipelines for operational excellence
- Ensure data quality, consistency, and performance across all data workflows
- Participate in code reviews and uphold best practices in data engineering
- Collaborate with QA teams to deliver production-ready, reliable systems
- Mentor junior engineers and promote knowledge sharing within the team
- Stay current with emerging data engineering tools, frameworks, and industry trends
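The ETL responsibilities above can be sketched as a minimal pipeline. This is an illustrative sketch in plain Python, not the team's actual implementation; in production this logic would be expressed as Apache Spark or Flink transformations, and all field and function names here are assumptions.

```python
# Minimal ETL sketch: extract raw ad-event records, clean and transform
# them, then load the aggregates. Illustrative only; a production
# pipeline would run these steps on Spark or Flink.

def extract(raw_rows):
    """Parse raw 'timestamp,campaign,spend' CSV lines, skipping blanks."""
    for line in raw_rows:
        if not line.strip():
            continue
        ts, campaign, spend = line.split(",")
        yield {"ts": ts, "campaign": campaign, "spend": float(spend)}

def transform(events):
    """Aggregate spend per campaign (the 'T' in ETL)."""
    totals = {}
    for e in events:
        totals[e["campaign"]] = totals.get(e["campaign"], 0.0) + e["spend"]
    return totals

def load(totals, sink):
    """Write aggregates to a sink (a plain dict stands in for a table)."""
    sink.update(totals)
    return sink

raw = ["2024-01-01,ctv,1.50", "", "2024-01-01,in_app,2.25", "2024-01-02,ctv,0.50"]
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # {'ctv': 2.0, 'in_app': 2.25}
```

The same extract/transform/load shape scales from this toy loop to terabyte-scale distributed jobs; only the execution engine changes.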
What You’ll Need:
- 2+ years of experience building ETL pipelines using Apache Spark and/or Apache Flink
- Hands-on experience with big data caching solutions such as ScyllaDB, Aerospike, or similar
- Strong understanding of data lake architectures and tools like Delta Lake
- Proven experience handling terabytes of data in distributed environments
- Proficiency in Scala, Python, or Java
- Experience working with cloud data platforms (AWS S3, Azure Data Lake, Google BigQuery)
- Strong knowledge of SQL, data modeling, and data warehousing concepts
- Familiarity with Git and CI/CD workflows
- Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment
Nice to Have
- Experience with Apache Kafka for real-time data streaming
- Familiarity with Apache Airflow or similar orchestration tools
Requirements:
- Sound knowledge of PHP, MySQL, jQuery & JavaScript.
- Should know the Yii/Yii2 MVC framework.
- Knowledge of mobile app APIs, JSON, etc.
- Knowledge of version control systems like Git is a plus.
- Strong analytical and problem-solving skills.
- Troubleshoot and fix any issues relating to PHP programs.
Highlights:
- Working 5 days a week.
- Group Health Insurance for our employees.
- Work with a team of 200+ excellent engineers.
- Extra Compensation for Night Shifts.
- Additional Salary for an extra day spent in the office.
- Lunch buffets for all our employees.
- Yearly and quarterly awards with cash rewards, dinner coupons, etc.
- Team Dinners on Project Completion.
- Festival celebrations, month-end celebrations, and much more.
Google Data Engineer - SSE
Position Description
Google Cloud Data Engineer
Notice Period: Immediate to 30 days
Job Description:
We are seeking a highly skilled Data Engineer with extensive experience in Google Cloud Platform (GCP) data services and big data technologies. The ideal candidate will be responsible for designing, implementing, and optimizing scalable data solutions while ensuring high performance, reliability, and security.
Key Responsibilities:
• Design, develop, and maintain scalable data pipelines and architectures using GCP data services.
• Implement and optimize solutions using BigQuery, Dataproc, Composer, Pub/Sub, Dataflow, GCS, and BigTable.
• Work with GCP databases such as Bigtable, Spanner, CloudSQL, AlloyDB, ensuring performance, security, and availability.
• Develop and manage data processing workflows using Apache Spark, Hadoop, Hive, Kafka, and other Big Data technologies.
• Ensure data governance and security using Dataplex, Data Catalog, and other GCP governance tooling.
• Collaborate with DevOps teams to build CI/CD pipelines for data workloads using Cloud Build, Artifact Registry, and Terraform.
• Optimize query performance and data storage across structured and unstructured datasets.
• Design and implement streaming data solutions using Pub/Sub, Kafka, or equivalent technologies.
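The streaming responsibilities above usually centre on windowed aggregation. Below is a minimal, hypothetical sketch in plain Python of the tumbling-window counting that a Dataflow/Beam or Kafka-consumer pipeline performs; the event names and the 60-second window size are assumptions for illustration.

```python
# Sketch of tumbling-window aggregation, the core of many streaming
# pipelines (Dataflow/Beam, Kafka consumers). Pure Python for clarity;
# a real pipeline would assign windows via the framework's windowing API.

WINDOW_SECONDS = 60

def window_counts(events):
    """Count events per (window start, event type).
    Each event is a tuple (epoch_seconds, event_type)."""
    counts = {}
    for ts, kind in events:
        window_start = ts - (ts % WINDOW_SECONDS)  # tumbling-window key
        key = (window_start, kind)
        counts[key] = counts.get(key, 0) + 1
    return counts

events = [(0, "click"), (30, "click"), (61, "view"), (119, "view"), (120, "click")]
print(window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 2, (120, 'click'): 1}
```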
Required Skills & Qualifications:
• 8-15 years of experience
• Strong expertise in GCP Dataflow, Pub/Sub, Cloud Composer, Cloud Workflow, BigQuery, Cloud Run, Cloud Build.
• Proficiency in Python and Java, with hands-on experience in data processing and ETL pipelines.
• In-depth knowledge of relational databases (SQL, MySQL, PostgreSQL, Oracle) and NoSQL databases (MongoDB, Scylla, Cassandra, DynamoDB).
• Experience with Big Data platforms such as Cloudera, Hortonworks, MapR, Azure HDInsight, IBM Open Platform.
• Strong understanding of AWS Data services such as Redshift, RDS, Athena, SQS/Kinesis.
• Familiarity with data formats such as Avro, ORC, Parquet.
• Experience handling large-scale data migrations and implementing data lake architectures.
• Expertise in data modeling, data warehousing, and distributed data processing frameworks.
• GCP Data Engineering certification or equivalent.
Good to Have:
• Experience in BigQuery, Presto, or equivalent.
• Exposure to Hadoop, Spark, Oozie, HBase.
• Understanding of cloud database migration strategies.
• Knowledge of GCP data governance and security best practices.
Field Sales
Home loans & LAP (loan against property)
Sales through the open market
Generating business through DSAs and connectors
Role : Lead – Influencer Marketing
Overview of job -
Our client is the world’s largest media investment company and a part of WPP. In fact, we are responsible for one in every three ads you see globally. We are looking to hire a Partner – Influencer Marketing to join us. In this role, you will lead the advocacy narrative for the largest FMCG organization in India. Expertise in digital marketing and planning, an entrepreneurial mindset, and a knack for influencer marketing are the key strengths expected of the right candidate. You will build the influencer marketing ecosystem, working with a team of 15+ young minds specializing in digital & influencer marketing. The role entails managing and building the influencer scope for more than 50 brands.
The 5 key pillars of the job -
A) Liaison with the global team and local to maintain and further develop a strategic narrative on influencer marketing
B) Blur lines between digital planning and influencer planning to drive effectiveness
C) Develop and deliver creative - first of its kind advocacy solutions for brands
D) Create a measurable framework for influencer marketing
E) A strong understanding of the social media, digital marketing, and content marketing ecosystem will be an important element of this role
We are currently looking for an individual with 10+ years’ experience, based out of Mumbai, to join us.
Minimum qualifications:
• Graduate – Mandatory, Post graduate – Preferred
• Minimum 10+ years’ experience with 4+ years of experience in digital marketing/digital planning/performance marketing
• Should have previously managed a team
• Experience in influencer marketing, Social media, and ecommerce
Profile Summary:
Performance engineers should be proficient in at least one programming language and should be not only programmers but also capable testers. A performance engineer must run performance tests, analyse the results, and provide appropriate solutions to help enhance system performance, reliability, and scalability. They are also often required to work with engineers and developers to fix bugs.
Skills Required:
1. A performance engineer must be familiar with at least one of C, C++, C#/.NET, Java, or Python. Strong knowledge of any one of these languages is a must.
2. Strong Software Development Skills
3. Experience with Logging and Performance Tools
Tools: Apache JMeter, MS VSTS, Shell, Jenkins, Dynatrace, Datadog, Splunk
4. Program Scripting
5. Experience in database profiling with a standard database such as SQL Server (MS SQL)
6. Strong logical reasoning and logic-building skills
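The profile above centres on running tests and analysing results. As a minimal, hypothetical illustration of the kind of measurement involved, here is a tiny latency harness in Python; real load testing would use the tools named above (e.g. Apache JMeter) against a live system, and the target function here is purely a stand-in.

```python
# Minimal latency-measurement sketch: time repeated calls to a target
# function and report nearest-rank percentiles. Illustrative only.
import time

def measure(fn, runs=50):
    """Return sorted per-call latencies (in seconds) for `fn`."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return sorted(samples)

def percentile(samples, p):
    """Nearest-rank percentile over pre-sorted samples."""
    idx = min(len(samples) - 1, int(len(samples) * p / 100))
    return samples[idx]

lat = measure(lambda: sum(range(1000)))  # stand-in workload
print(f"p50={percentile(lat, 50):.6f}s  p95={percentile(lat, 95):.6f}s")
```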
Experience: 3-4+ yrs
Profile: Backend Developer (Frappe/ Flask/ Python) - ERPNext Developer Location: Delhi NCR, Can consider remote as well
About Expat Orbit
At Expat Orbit, we are building India’s First Expat-Only Platform, powered by tech-based innovation and wide experience in expatriate consultancy, to provide end-to-end handholding support to expatriates (foreigners working in another country) and global companies. In less than 3 years, we have added major MNCs like Benetton, Suez, Home Credit, and Singapore Angel Network to our clientele and are already serving more than 8 nationalities. The next 5-year plan includes strengthening our B2B & B2C presence in India while expanding to other geographies.
Our service suite includes expat compliances (taxation, immigration, social security, corporate expat policy module consulting), expat logistics (accommodation, commuting solutions, family support), unique mobile application (tourism, online and offline events, cultural workshops, expat community, expat-focused content) enabling a better expat experience in India.
Website - https://expatorbit.com/ LinkedIn – Expat Orbit
Skills Required:
- Experience in Frappe, ERPNext, JS, MySQL, and Python-based Frappe web development
- MySQL on MariaDB or equivalent
- Ability to build programming logic for business scenarios
- Creating web applications in the Frappe framework
- Frappe framework implementation and customization knowledge in Python
- Hands-on experience building custom DocTypes in Frappe
- Hands-on Frappe API integration
- Knowledge of hooks in Frappe
- Experience in front-end web development using HTML, CSS, Jinja, and other required frameworks
- Customisation of existing ERPNext DocTypes, creation of new custom DocTypes, and integration of other APIs
Key Responsibilities:
- Involved in product development using Frappe based ERPNext, leveraging existing modules, customizations, and creating new modules
- Design, Configure, Build, Test, Deploy and Maintain ERPNext Platform
Desired Candidate Profile:
- Minimum 2 years of hands-on product development experience with ERPNext, the Frappe Framework, or Flask
- Be a team player
- Energetic, self-motivated, and self-sufficient in accomplishing tasks
If you have nodded to each of these points, we already like you :)
CTO’s profile:
Gunjan Jaswal (https://www.linkedin.com/in/gunjanjaswal/) - 12+ years of hands-on experience across GMR Infrastructure and Scoopwhoop, and a unique entrepreneurial streak
Founder’s profiles:
Henna Vij (https://www.linkedin.com/in/henna-vij/) - MBA from IIM Bangalore, with 8 years of experience across HSBC, KPMG, and Accenture Management Consulting
Prateek Agarwal (https://www.linkedin.com/in/prateek-agarwal-7654a8138/) - Chartered Accountant with 10 years of experience with KPMG in expat compliances
Why should you join us
- Work on a first-of-its-kind product in a niche industry - get a chance to make an impact in a hypergrowth setup early in your career
- Accelerate Career Growth- Being in smaller teams, you will get real ownership and responsibility.
- Great Work Culture & Benefits- We take a human-first approach in whatever we do. Our supportive culture allows you to strive and express yourself so that you really feel at home.
Job Description
Cyber Threat Intelligence & Threat Hunting - Subject Matter Expert (B3-2)
Responsibilities:
Perform threat research, create actionable threat advisories, and derive hunting queries based on evolving threat vectors.
Understand APT groups and conduct deep-dive technical analysis of cyber-attack tools, tactics, and procedures. Create hypotheses and perform active threat hunting.
Minimum Requirements:
10+ years of overall experience, including 7+ years in cyber threat intelligence and malware analysis (reverse engineering)
Hands-on experience writing threat-hunting hypotheses and performing active threat hunting
Experience with YARA rule and OpenIOC signature creation.
Experience with multi-tiered mission-critical systems.
Experience with open-source sandboxes and honeypots.
Preferred Certification
GIAC Cyber Threat Intelligence (GCTI)
C|TIA (Certified Threat Intelligence Analyst)
CCTIA by the NICCS
1. Act as first-line support for the IT department
2. Answer internal and external customer calls and emails
3. Create service desk tickets, assign technicians, and follow up on closure
4. Monitor the service desk dashboard and send alerts for SLA violations
5. End-to-end front-line coordination
Skills - experience interacting with US clients; basic understanding of IT setups and computing issues. ITIL V3 certification would be an added advantage.
Please note that the shift timings will be 6 pm to 3 am and 8:30 pm to 5:30 am.
Home drop available.
Joining bonus for early joiners.









