
SEO Executive Job Description
- We are looking for a results-driven SEO Executive to be responsible for developing optimized web content. The SEO Executive's responsibilities include working closely with the marketing team to achieve SEO objectives, measuring the success of SEO efforts and ROI, and assisting with the maintenance of the website's architecture to ensure user-friendliness.
- To be successful as an SEO Executive, you should have strong copywriting and analytical skills, knowledge of coding techniques, and a commitment to constantly improving key skills. Ultimately, an SEO Executive should have outstanding knowledge of SEO, a passion for the industry, and strong time-management skills.
SEO Executive Responsibilities:
- Conducting on-site and off-site analysis of web SEO competition.
- Using Google Analytics to produce regular performance reports.
- Creating high-quality SEO content.
- Assisting with blog content.
- Leading keyword research and optimization of content.
- Keeping up-to-date with developments in SEM.
IT Company
Full Time Office Job
Salary - 10k to 14k in hand
Experience - 1 to 2 years

About Vega Moon Technologies
Vega Moon Technologies is a leading website design, development, and digital marketing company whose main goal is to grow every entity that partners with us. Be it an astrology-based organization, a manufacturing-based establishment, or an IT-based corporation, all such agencies can be taken to the next level by the knowledge and expertise of the experts at Vega Moon Technologies.
Review Criteria
- Strong Data / ETL Test Engineer
- 5+ years of overall experience in Testing/QA
- 3+ years of hands-on end-to-end data testing/ETL testing experience, covering data extraction, transformation, load validation, and reconciliation, across BI / Analytics / Data Warehouse / e-Governance platforms
- Must have strong understanding and hands-on exposure to Data Warehouse concepts and processes, including fact & dimension tables, data models, data flows, aggregations, and historical data handling.
- Must have experience in Data Migration Testing, including validation of completeness, correctness, reconciliation, and post-migration verification from legacy platforms to upgraded/cloud-based data platforms.
- Must have independently handled test strategy, test planning, test case design, execution, defect management, and regression cycles for ETL and BI testing
- Hands-on experience with ETL tools and SQL-based data validation is mandatory (Working knowledge or hands-on exposure to Redshift and/or Qlik will be considered sufficient)
- Must hold a Bachelor's degree (B.E./B.Tech) or a Master's degree (M.Tech/MCA/M.Sc/MS)
- Must demonstrate strong verbal and written communication skills, with the ability to work closely with business stakeholders, data teams, and QA leadership
- Mandatory Location: Candidate must be based within Delhi NCR (100 km radius)
Preferred
- Relevant certifications such as ISTQB or Data Analytics / BI certifications (Power BI, Snowflake, AWS, etc.)
Job Specific Criteria
- CV Attachment is mandatory
- Do you have experience working on Government projects/companies? If so, please briefly describe the project.
- Do you have experience working on enterprise projects/companies? If so, please briefly describe the project.
- Please mention the names of 2 key projects you have worked on related to Data Warehouse / ETL / BI testing.
- Do you hold any ISTQB or Data / BI certifications (Power BI, Snowflake, AWS, etc.)?
- Do you have exposure to BI tools such as Qlik?
- Are you willing to relocate to Delhi and why (if not from Delhi)?
- Are you available for a face-to-face round?
Role & Responsibilities
- 5 years’ experience in Data Testing across BI/Analytics platforms, with at least 2 large-scale enterprise Data Warehouse / Analytics / e-Governance programs
- Proficiency in ETL, Data Warehouse, and BI report/dashboard validation, including test planning, data reconciliation, acceptance-criteria definition, defect triage, and regression cycle management for BI landscapes
- Proficient in analyzing business requirements and data-mapping specifications (BRDs, data models, source-to-target mappings, user stories, reports, dashboards) to define comprehensive test scenarios and test cases
- Ability to review high-level and low-level data models, ETL workflows, API specifications, and business-logic implementations to design test strategies ensuring the accuracy, consistency, and performance of data pipelines
- Ability to test and validate data migrated from a legacy platform to an upgraded platform, ensuring the completeness and correctness of the migration
- Experience conducting tests of migrated data and defining test scenarios and test cases for the same
- Experience with BI tools (e.g., Qlik), ETL platforms, Data Lake platforms, and Redshift to support end-to-end validation
- Exposure to Data Quality, Metadata Management, and Data Governance frameworks, ensuring KPIs, metrics, and dashboards align with business expectations
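The migration-validation and reconciliation work listed above can be sketched as a SQL-based completeness/correctness check, e.g. comparing row counts, an aggregate checksum, and missing keys between a legacy table and its migrated counterpart. A minimal sketch using Python's built-in sqlite3 (all table and column names are hypothetical):

```python
import sqlite3

def reconcile(conn, source_table, target_table, key, amount_col):
    """Compare row counts and a simple aggregate checksum between two tables."""
    cur = conn.cursor()
    counts, sums = {}, {}
    for table in (source_table, target_table):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        counts[table], sums[table] = cur.fetchone()
    # Keys present in source but missing from target (completeness check)
    cur.execute(
        f"SELECT {key} FROM {source_table} "
        f"EXCEPT SELECT {key} FROM {target_table}"
    )
    missing = [row[0] for row in cur.fetchall()]
    return {
        "count_match": counts[source_table] == counts[target_table],
        "sum_match": sums[source_table] == sums[target_table],
        "missing_keys": missing,
    }

# Toy demo: a legacy table and a migrated table with one row dropped
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE legacy_orders (order_id INTEGER, amount REAL);
CREATE TABLE migrated_orders (order_id INTEGER, amount REAL);
INSERT INTO legacy_orders VALUES (1, 10.0), (2, 20.0), (3, 30.0);
INSERT INTO migrated_orders VALUES (1, 10.0), (2, 20.0);
""")
report = reconcile(conn, "legacy_orders", "migrated_orders", "order_id", "amount")
print(report)  # the dropped row shows up as a missing key
```

In practice the same pattern runs against the real source and target engines (e.g., Redshift) with per-column hash totals rather than a single SUM.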
Job Details
- Job Title: Enterprise Sales Manager (B2B SaaS)
- Industry: Software Technology Company
- Experience Required: 2-10 years
- Working Days: 5 days/week
- Job Location: Mumbai
- CTC Range: Best in Industry
Review Criteria
- Strong enterprise sales executive profile
- 2+ years of selling B2B SaaS
- Must have 2+ years of experience selling to enterprise clients, OR to the manufacturing industry, OR selling FinTech products / SAP products / Finance ERP solutions (e.g., invoice processing, vendor management, source-to-pay, compliance solutions)
- Must have experience in end-to-end sales from lead generation, prospecting, demos, proposal building, negotiation, and deal closure
- Must have stable career history — no frequent job hopping
- Final round is F2F (client will handle the travel)
Role & Responsibilities
We are looking for a dynamic and results-driven Enterprise Sales Manager to drive our sales strategy and expand our market presence. This role demands a strong understanding of SAP/Finance ERP solutions, excellent communication skills, and a proven track record in IT/software sales.
Key Responsibilities:
- Sales Strategy Development: Develop and execute sales plans to achieve company revenue targets in the SAP/ERP domain.
- Client Acquisition: Identify, engage, and convert prospective clients by demonstrating the value of our SAP/ERP solutions.
- Relationship Management: Build and maintain long-term relationships with clients, ensuring high levels of satisfaction and retention.
- Market Analysis: Stay updated on industry trends, competitor activities, and market demands to identify growth opportunities.
- Proposal & Presentation: Prepare and deliver compelling proposals, presentations, and demos tailored to client needs.
- Collaboration: Work closely with technical and consulting teams to ensure seamless delivery of solutions and services.
Ideal Candidate
- Experience: Minimum 2 years in sales, with a strong focus on SAP Product sales/Finance ERP solutions
- Industry Preference: Candidates with prior experience in handling manufacturing industry clients will be given preference.
- Educational Qualification: Bachelor’s degree in Business, IT, or a related field. An MBA is an added advantage.
Skills:
- Proven ability to meet and exceed sales targets.
- Excellent communication, negotiation, and presentation skills.
- Understanding of SAP/ERP systems and their applications in business processes.
- Strong client relationship management abilities.
- Track record of success managing large enterprise accounts
- Track record of consistently over-achieving quota (top 10% in your company)
- Strong interpersonal and presentation skills
- Exceptional verbal and written communication skills
- Ability to travel to prospects and customers if required
- Good organizer with the ability to prioritize and multitask
- Proven ability to manage multiple concurrent sales cycles.
🤖 Data Scientist – Frontier AI for Data Platforms & Distributed Systems (4–8 Years)
Experience: 4–8 Years
Location: Bengaluru (On-site / Hybrid)
Company: Publicly Listed, Global Product Platform
🧠 About the Mission
We are building a Top 1% AI-Native Engineering & Data Organization — from first principles.
This is not incremental improvement.
This is a full-stack transformation of a large-scale enterprise into an AI-native data platform company.
We are re-architecting:
- Legacy systems → AI-native architectures
- Static pipelines → autonomous, self-healing systems
- Data platforms → intelligent, learning systems
- Software workflows → agentic execution layers
This is the kind of shift you would expect from companies like Google or Microsoft, except here you will build it from day zero and scale it globally.
🧠 The Opportunity: This role sits at the intersection of three high-impact domains:
1. Frontier AI Systems: Large Language Models (LLMs), Small Language Models (SLMs), and Agentic AI
2. Data Platforms: Warehouses, Lakehouses, Streaming Systems, Query Engines
3. Distributed Systems: High-throughput, low-latency, multi-region infrastructure
We are building systems where:
- Data platforms optimize themselves using ML/LLMs
- Pipelines are autonomous, self-healing, and adaptive
- Queries are generated, optimized, and executed intelligently
- Infrastructure learns from usage and evolves continuously
This is: AI as the control plane for data infrastructure
🧩 What You’ll Work On
You will design and build AI-native systems deeply embedded inside data infrastructure.
1. AI-Native Data Platforms
- Build LLM-powered interfaces: natural language → SQL / pipelines / transformations
- Design semantic data layers: embeddings, vector search, knowledge graphs
- Develop AI copilots for data engineers, analysts, and platform users
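As a toy illustration of the vector-search piece of such a semantic layer, nearest-neighbour lookup over embeddings reduces to cosine similarity. The "embeddings" below are hand-made stand-ins, not real model output:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy "embeddings" for table descriptions (a real system would use a model)
index = {
    "orders":    [0.9, 0.1, 0.0],
    "customers": [0.1, 0.9, 0.1],
    "shipments": [0.7, 0.2, 0.6],
}

def search(query_vec, k=2):
    """Return the k index entries most similar to the query vector."""
    scored = sorted(index.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [name for name, _ in scored[:k]]

print(search([0.8, 0.15, 0.1]))  # tables ranked by similarity to the query
```

Real systems replace the in-memory dict with a vector database and generate both query and index vectors from an embedding model.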
2. Autonomous Data Pipelines
- Build self-healing ETL/ELT systems using AI agents
- Create pipelines that:
  - Detect anomalies in real time
  - Automatically debug failures
  - Dynamically optimize transformations
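A minimal sketch of such a self-healing step, where a trivial invariant (a minimum row count) stands in for real anomaly detection, with retry and a fallback recovery path (all names illustrative):

```python
import logging

logging.basicConfig(level=logging.INFO)

def run_with_healing(step, fallback, validate, max_retries=2):
    """Run a pipeline step; on anomaly or failure, retry, then fall back."""
    for attempt in range(max_retries):
        try:
            result = step()
            if validate(result):
                return result
            logging.warning("attempt %d: anomaly detected, retrying", attempt)
        except Exception as exc:
            logging.warning("attempt %d: step failed (%s), retrying", attempt, exc)
    logging.info("falling back to recovery path")
    return fallback()

# Toy demo: the primary extract "loses" rows; the fallback re-reads them
primary = lambda: [1, 2]            # anomalous output: too few rows
recovery = lambda: [1, 2, 3, 4]     # e.g., re-extract from the source system
rows = run_with_healing(primary, recovery, validate=lambda r: len(r) >= 3)
print(rows)
```

An agentic version replaces the fixed fallback with a diagnosis step that inspects logs and chooses a repair action.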
3. Intelligent Query & Compute Optimization
- Apply ML/LLMs to:
  - Query planning and execution
  - Cost-based optimization using learned models
  - Workload prediction and scheduling
- Build systems that:
  - Learn from query patterns
  - Continuously improve performance and cost efficiency
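Learning from query patterns can be illustrated with a tiny bandit-style chooser that tracks observed latencies per execution plan and favours the historically fastest one (plan names and latencies are made up):

```python
import random

class PlanChooser:
    """Pick the execution plan with the best observed average latency."""

    def __init__(self, plans, epsilon=0.1):
        self.stats = {p: [0.0, 0] for p in plans}  # plan -> [total_latency, runs]
        self.epsilon = epsilon

    def choose(self):
        unseen = [p for p, (_, n) in self.stats.items() if n == 0]
        if unseen:
            return unseen[0]                        # try every plan at least once
        if random.random() < self.epsilon:
            return random.choice(list(self.stats))  # keep exploring occasionally
        return min(self.stats, key=lambda p: self.stats[p][0] / self.stats[p][1])

    def record(self, plan, latency):
        total, n = self.stats[plan]
        self.stats[plan] = [total + latency, n + 1]

# Toy demo with epsilon=0 so the choice is deterministic
chooser = PlanChooser(["hash_join", "merge_join"], epsilon=0.0)
for latency in (120.0, 130.0):
    chooser.record("hash_join", latency)
for latency in (40.0, 45.0):
    chooser.record("merge_join", latency)
print(chooser.choose())  # the plan with the lower average latency
```

Production optimizers use far richer features (cardinalities, selectivities, hardware state), but the feedback loop — execute, measure, update, re-choose — is the same shape.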
4. Distributed Data + AI Infrastructure
- Architect systems operating at:
  - Billions of events per day
  - Petabyte-scale data
- Work with:
  - Distributed compute engines (Spark / Flink / Ray-class systems)
  - Streaming systems (Kafka-class infra)
  - Vector databases and hybrid retrieval systems
5. Learning Systems & Feedback Loops
- Build closed-loop AI systems: execution → feedback → model updates
- Develop:
  - Continual learning pipelines
  - Online learning systems for infra optimization
  - Experimentation frameworks (A/B, bandits, eval pipelines)
6. LLM & Agentic Systems (Infra-Aware)
- Build agents that understand data systems
- Enable:
  - Autonomous pipeline debugging
  - Root cause analysis for infra failures
  - Intelligent orchestration of data workflows
🧠 What We’re Looking For
Core Foundations
- Strong grounding in:
  - Machine Learning, Deep Learning, NLP
  - Statistics, optimization, probabilistic systems
  - Distributed systems fundamentals
- Deep understanding of:
  - Transformer architectures
  - Modern LLM ecosystems
Hands-On Expertise
- Experience building:
  - LLM / GenAI systems (RAG, fine-tuning, embeddings)
  - Data platforms (warehouse, lake, lakehouse architectures)
  - Distributed pipelines and compute systems
- Strong programming skills:
  - Python (ML/AI stack)
  - SQL (deep understanding: query planning, an optimization mindset)
Systems Thinking (Critical)
You think in systems, not components.
- Built or worked on:
  - Large-scale data pipelines
  - High-throughput distributed systems
  - Low-latency, high-concurrency architectures
- Understand:
  - Query optimization and execution
  - Data partitioning, indexing, caching
  - Trade-offs in distributed systems
🔥 What Sets You Apart (Top 1%)
- Built AI-powered data platforms or infra systems in production
- Designed or contributed to:
  - Query engines / optimizers
  - Data observability / lineage systems
  - AI-driven infra or AIOps platforms
- Experience with:
  - Multi-modal AI (logs, metrics, traces, text)
  - Agentic AI systems
  - Autonomous infrastructure
- Worked on systems at a scale comparable to:
  - Google (BigQuery-like systems)
  - Meta (real-time analytics infra)
  - Snowflake / Databricks (lakehouse architectures)
🧬 Ideal Background (Not Mandatory)
We often see strong candidates from:
- Data infrastructure or platform engineering teams
- AI-first startups or research-driven environments
- High-scale product companies
Experience building:
- Internal platforms used by 1000s of engineers
- Systems serving millions of users / high throughput workloads
- Multi-region, distributed cloud systems
🧠 The Kind of Problems You’ll Solve
- Can LLMs replace traditional query optimizers?
- How do we build self-healing data pipelines at scale?
- Can data systems learn from every query and improve automatically?
- How do we embed reasoning and planning into infrastructure layers?
- What does a fully autonomous data platform look like?
Backgrounds We Commonly See (But Not Limited To)
Our team often includes engineers from top-tier institutions and strong research or product backgrounds, including:
- Leading engineering schools in India and globally
- Engineers with experience at top product companies, AI startups, or research-driven environments
That said, we care far more about demonstrated ability, depth, and impact than pedigree alone.
Key Responsibilities
- Design, develop, and maintain mobile automation frameworks using Appium
- Automate native, hybrid, and mobile web applications (Android & iOS)
- Write, execute, and maintain automated test scripts
- Perform regression, smoke, and sanity testing
- Integrate automation scripts into CI/CD pipelines
- Analyze test results and report defects using bug tracking tools
- Collaborate with developers, product owners, and QA team
- Participate in test planning and test case review
Note: A face-to-face interview is mandatory, held on weekdays.
Responsibilities & Duties:
Making calls to customers and maintaining the database
Lead verification
Handling client queries/escalations
Maintaining relationships with existing clients and encouraging them to upgrade or renew their packages
Required Experience, Skills, and Qualifications
Experience: Fresher
Skills: Proficient Written & Verbal Communication Skills in English
Qualifications: Any Graduate
Job Description for Java:
• Experience in Java programming and application development using Spring MVC, Spring Boot, Spring Security, Hibernate, and microservices
• Experience in building products with full-stack technologies, with an excellent understanding of computer science fundamentals, data structures, algorithms, OOP, and OOA/D
• Experience in MySQL, MongoDB (or other NoSQL databases), REST, WebSockets, JavaScript, Ajax
• Experience with Go is a plus
• Experience with one of the JavaScript frameworks (React, Angular, Vue, etc.)
• Experience in REST API development using Spring MVC or Spring Boot
• Able to define, design, and implement complex, scalable systems
• Good team player with strong communication skills
• Experience with agile development methodologies and Test-Driven Development (TDD)
• Experience setting up CI/CD with Jenkins (Continuous Integration / Continuous Delivery)
Job Description
- Identifies business opportunities by finding and qualifying prospects, evaluating their position in the industry, and researching and analyzing sales options.
- Sells products by establishing contact and developing relationships with prospects; recommending solutions.
- Maintains relationships with clients by providing support, information, and guidance; researching and recommending new opportunities; recommending profit and service improvements.
- Identifies product improvements or new products by remaining current on industry trends, market activities, and competitors.
- Prepares reports by collecting, analyzing, and summarizing information.
- Maintains quality service by establishing and enforcing organization standards.
- Contributes to team effort by accomplishing related results as needed.
- Team Handling.
Skills Required:
- Strong communication
- Strong marketing skills, including negotiation and persuasion
- Good oral/written/interpersonal/presentation skills.
- Possesses an energetic, outgoing and friendly demeanor.
