
What We’re Looking For
• Hands-on experience in keyword research, competitor analysis and content gap identification
• Strong understanding of on-page SEO: meta tags, schema, internal linking, URL structure and content optimization
• Experience managing technical SEO: site audits, crawling issues, indexing, page speed and mobile optimization
• Ability to plan and execute backlink strategies using safe, high-quality methods
• Familiarity with tools like Google Search Console, Google Analytics, Ahrefs, SEMrush, Screaming Frog
• Experience working with blogs, landing pages and long-form content
• Ability to coordinate with writers, developers and designers for SEO requirements
• Proven experience in improving rankings for competitive keywords
• Understanding of local SEO and structured data markup
• Comfortable working in a fast-moving, bootstrapped startup environment
• Bonus: Experience with Django or basic HTML/CSS is useful but not mandatory
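Since the role calls out schema and structured data markup, here is a minimal sketch of the kind of JSON-LD a school listing page might carry. The school name, address, and URL below are invented placeholders, not real EduPowerPro data.

```python
import json

# Hypothetical example: schema.org structured data for a boarding-school
# listing page. All details are invented for illustration only.
school_schema = {
    "@context": "https://schema.org",
    "@type": "School",
    "name": "Example Boarding School",  # placeholder name
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Dehradun",
        "addressCountry": "IN",
    },
    "url": "https://www.example.com/schools/example-boarding-school",
}

# JSON-LD is embedded in the page inside a <script type="application/ld+json"> tag.
json_ld = json.dumps(school_schema, indent=2)
print(json_ld)
```

In practice the markup would be validated with Google's Rich Results Test before deployment.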
What You Will Work On
• Improving rankings for keywords like “Top boarding schools in India”, “Best boarding schools”, etc.
• Conducting monthly audits and pushing technical SEO fixes
• Growing EduPowerPro’s organic traffic through structured content planning
• Managing backlink acquisition and partnerships
• Tracking performance and presenting monthly insights

About EduPowerPro
EduPowerPro is an education guidance platform that helps parents discover, compare and select the best boarding schools in India. Families use our platform to explore schools by fees, category, location and facilities. We also offer personalised counselling so parents can make decisions with clarity and confidence.
What We Do
• Build detailed school profiles with verified information
• Offer search, filtering and comparison tools
• Provide expert counselling and admission support
• Help schools improve their digital presence and connect with the right parents
Our Mission
To make school selection simple and transparent for every parent by combining trustworthy data with human guidance.
Who Uses EduPowerPro
Parents from across India and abroad searching for CBSE, ICSE, IB, IGCSE, girls, boys, co-ed and full boarding schools. Schools also collaborate with us for listings, digital outreach and lead management.
Team & Culture
We work like a small, focused team that moves fast and learns quickly. The culture is practical, ownership-driven and friendly. Everyone works with clear goals and the freedom to recommend better ways of doing things.
Why Work With Us
• Opportunity to build India’s most trusted K–12 school discovery platform
• Work on real problems that impact families directly
• Fast decision-making and flexible work culture
• Space to innovate in EdTech, counselling, marketing and operations
What We Look For
People who are curious, reliable and comfortable working in a structured but evolving startup environment. Good communication skills and a problem-solving mindset matter more than rigid experience.
Contact: +91 9839382045
Similar jobs
Position : Senior Data Analyst
Experience Required : 5 to 8 Years
Location : Hyderabad or Bangalore (Work Mode: Hybrid – 3 Days WFO)
Shift Timing : 11:00 AM – 8:00 PM IST
Notice Period : Immediate Joiners Only
Job Summary :
We are seeking a highly analytical and experienced Senior Data Analyst to lead complex data-driven initiatives that influence key business decisions.
The ideal candidate will have a strong foundation in data analytics, cloud platforms, and BI tools, along with the ability to communicate findings effectively across cross-functional teams. This role also involves mentoring junior analysts and collaborating closely with business and tech teams.
Key Responsibilities :
- Lead the design, execution, and delivery of advanced data analysis projects.
- Collaborate with stakeholders to identify KPIs, define requirements, and develop actionable insights.
- Create and maintain interactive dashboards, reports, and visualizations.
- Perform root cause analysis and uncover meaningful patterns from large datasets.
- Present analytical findings to senior leaders and non-technical audiences.
- Maintain data integrity, quality, and governance in all reporting and analytics solutions.
- Mentor junior analysts and support their professional development.
- Coordinate with data engineering and IT teams to optimize data pipelines and infrastructure.
Must-Have Skills :
- Strong proficiency in SQL and Databricks
- Hands-on experience with cloud data platforms (AWS, Azure, or GCP)
- Sound understanding of data warehousing concepts and BI best practices
Good-to-Have :
- Experience with AWS
- Exposure to machine learning and predictive analytics
- Industry-specific analytics experience (preferred but not mandatory)
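As a small illustration of the KPI-style SQL work this role describes, here is a toy aggregation-and-ranking query run against an in-memory SQLite database. The `orders` table and its values are invented for the example; a real deployment would run similar SQL on Databricks or a cloud warehouse.

```python
import sqlite3

# Toy example of KPI analysis in SQL: total revenue per region, ranked.
# The "orders" table and its data are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('North', 120.0), ('North', 80.0), ('South', 250.0);
""")

query = """
    SELECT region,
           SUM(amount)                              AS revenue,
           RANK() OVER (ORDER BY SUM(amount) DESC)  AS revenue_rank
    FROM orders
    GROUP BY region
    ORDER BY revenue_rank
"""
for row in conn.execute(query):
    print(row)
```

The window function (`RANK() OVER ...`) is the kind of construct that distinguishes "complex SQL" from plain aggregation in day-to-day analyst work.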
Springer Capital is a cross-border asset management firm specializing in real estate investment banking between China and the USA. We are offering a remote internship for aspiring operations and technology interns interested in data modeling, workflow automation, cloud platforms, and business intelligence. The internship offers flexible start and end dates. A short quiz or technical task may be required as part of the selection process.
Responsibilities:
- Assist with day-to-day operations and administrative tasks across technology and HR functions.
- Support data modeling and validation tasks to improve business tracking.
- Collaborate on ETL processes and help streamline data collection from internal and external systems.
- Work with cloud platforms (AWS, GCP, Azure) and databases (PostgreSQL, MySQL, MongoDB) for basic data-related tasks.
- Partner with analysts and developers to improve data accessibility and operational efficiency.
- Document workflows, maintain trackers, and update internal systems to ensure process accuracy.
- Assist with troubleshooting and process improvement initiatives.
Please send your resume to talent@springercapital
* Good knowledge of Node.js/Next.js, Express.js, React and MongoDB
* Clear understanding of JavaScript and TypeScript.
* Excellent grasp of data structures and of designing and developing REST APIs.
* Good skills in either an RDBMS (e.g. MySQL or PostgreSQL) or NoSQL (MongoDB or equivalent).
* At least 3 years' experience in MERN stack development.
* Experience in developing responsive web applications.
* Good communication skills.
* Sound understanding of Agile and Scrum methodologies and ability to participate in local and remote Sprints.
* Good grasp of UI / UX concepts.
* Should have experience in using Git and VSCode.
* Knowledge of AWS, Azure, CI / CD, Gitflow, shell scripting will be considered positively.
We are looking for a skilled and motivated Data Engineer with strong experience in Python programming and Google Cloud Platform (GCP) to join our data engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable ETL (Extract, Transform, Load) data pipelines. The role involves working with various GCP services, implementing data ingestion and transformation logic, and ensuring data quality and consistency across systems.
Key Responsibilities:
- Design, develop, test, and maintain scalable ETL data pipelines using Python.
- Work extensively on Google Cloud Platform (GCP) services such as:
- Dataflow for real-time and batch data processing
- Cloud Functions for lightweight serverless compute
- BigQuery for data warehousing and analytics
- Cloud Composer for orchestration of data workflows (based on Apache Airflow)
- Google Cloud Storage (GCS) for managing data at scale
- IAM for access control and security
- Cloud Run for containerized applications
- Perform data ingestion from various sources and apply transformation and cleansing logic to ensure high-quality data delivery.
- Implement and enforce data quality checks, validation rules, and monitoring.
- Collaborate with data scientists, analysts, and other engineering teams to understand data needs and deliver efficient data solutions.
- Manage version control using GitHub and participate in CI/CD pipeline deployments for data projects.
- Write complex SQL queries for data extraction and validation from relational databases such as SQL Server, Oracle, or PostgreSQL.
- Document pipeline designs, data flow diagrams, and operational support procedures.
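The data-quality checks mentioned above can be sketched without any cloud dependencies. The following is a simplified, library-free illustration of a row-level validation step an ETL pipeline might run before loading to the warehouse; the field names and rules are assumptions for the example, not part of any real schema.

```python
# Minimal sketch of a row-level data-quality check of the kind an ETL
# pipeline might run before loading data. Field names and validation
# rules here are invented for illustration.
REQUIRED_FIELDS = ("id", "event_ts", "amount")

def validate_row(row: dict) -> list:
    """Return a list of human-readable validation errors (empty if clean)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            errors.append(f"missing required field: {field}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("amount must be non-negative")
    return errors

# Split incoming rows into a clean batch and a quarantine batch.
clean, quarantined = [], []
for row in [{"id": 1, "event_ts": "2024-01-01", "amount": 9.5},
            {"id": 2, "event_ts": "", "amount": -3}]:
    (clean if not validate_row(row) else quarantined).append(row)

print(len(clean), len(quarantined))
```

In a real GCP pipeline the same logic would typically live inside a Dataflow transform or a pre-load step orchestrated by Cloud Composer, with failures routed to a dead-letter table.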
Required Skills:
- 4–8 years of hands-on experience in Python for backend or data engineering projects.
- Strong understanding and working experience with GCP cloud services (especially Dataflow, BigQuery, Cloud Functions, Cloud Composer, etc.).
- Solid understanding of data pipeline architecture, data integration, and transformation techniques.
- Experience in working with version control systems like GitHub and knowledge of CI/CD practices.
- Strong experience in SQL with at least one enterprise database (SQL Server, Oracle, PostgreSQL, etc.).
About Dapi:
Dapi is a financial technology infrastructure provider that allows businesses to accept and initiate bank transfers at nearly zero fees. We work to remove intermediaries thereby allowing our customers to deliver alternative payment solutions to their end-users. We are building a new financial infrastructure that will enable every company to offer its own embedded financial services. Within one year of our launch, we have managed to attract investments from Y Combinator, Pioneer Fund, and other notable VC funds. Read more about us in TechCrunch and Crunchbase.
What you’ll do
You will have the opportunity to use our leading proprietary tools to web scrape and integrate banks into our system. You will be exposed to a fast-paced, collaborative, and diverse work environment where you will get to see the real-world impact you make right away. We’re a team that openly welcomes new ideas, you will have the opportunity to present your own ideas and work on them once approved by the team.
What’s required
We believe that with a certain coding background, hard skills required for the job can be learned. However, soft skills like drive, passion, and commitment to learning cannot. Therefore, we are keen on finding candidates with the following inclinations:
- Tenacious and hardworking, with the ability to persistently work on a problem without losing sight of the end goal
- Ability to work in a fast-paced, high pressure environment
- Previous experience with TypeScript and Puppeteer is preferred
- Previous experience with web scraping is preferred
We are looking for a result-driven B2B marketing professional for overseas diamond and diamond jewellery sales, with good experience in certified and non-certified loose diamonds and knowledge of lab-grown diamonds. The candidate will be responsible for all sales duties, from generating leads to closing sales overseas, and should develop contacts of their own in the USA and Europe. Duties and responsibilities include working closely with customers to determine their needs, answering their questions about our products and recommending the right merchandise. You should be able to promptly resolve customer complaints and ensure maximum client satisfaction, and analyse the needs of existing and potential customers to meet their requirements.
A good understanding of loose diamonds, certified diamonds and diamond jewellery, and experience in the field, are most important, along with knowledge of natural and lab-grown diamonds. Exposure to the overseas market is also necessary.
Candidates with good USA and European marketing contacts will do well in this role. Evening and night shifts are required.
- Solve complex Cloud Infrastructure problems.
- Drive DevOps culture in the organization by working with engineering and product teams.
- Be a trusted technical advisor to developers and help them architect scalable, robust, and highly-available systems.
- Frequently collaborate with developers to help them learn how to run and maintain systems in production.
- Drive a culture of CI/CD. Find bottlenecks in the software delivery pipeline. Fix bottlenecks with developers to help them deliver working software faster. Develop and maintain infrastructure solutions for automation, alerting, monitoring, and agility.
- Evaluate cutting edge technologies and build PoCs, feasibility reports, and implementation strategies.
- Work with engineering teams to identify and remove infrastructure bottlenecks enabling them to move fast. (In simple words you'll be a bridge between tech, operations & product)
Skills required:
Must have:
- Deep understanding of open source DevOps tools.
- Scripting experience in one or more among Python, Shell, Go, etc.
- Strong experience with AWS (EC2, S3, VPC, Security, Lambda, CloudFormation, SQS, etc.)
- Knowledge of distributed system deployment.
- Deployed and orchestrated applications with Kubernetes.
- Implemented CI/CD for multiple applications.
- Set up monitoring and alert systems for services using the ELK stack or similar.
- Knowledge of Ansible, Jenkins, Nginx.
- Worked with queue-based systems.
- Implemented batch jobs and automated recurring tasks.
- Implemented caching infrastructure and policies.
- Implemented central logging.
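The queue-based systems and recurring-job items above usually imply retry logic around flaky network calls. Here is a generic, hedged sketch of retry with exponential backoff in plain Python; the delays and the `flaky` task are illustrative, not a prescription for any particular queue.

```python
import time

# Generic retry-with-exponential-backoff helper of the kind wrapped around
# queue consumers or batch jobs. Delays below are illustrative only.
def retry(task, attempts=4, base_delay=0.01):
    """Run `task`, retrying on exception with exponential backoff."""
    for attempt in range(attempts):
        try:
            return task()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s...

# Simulated transient failure: fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(retry(flaky))
```

Production systems would typically add jitter to the delay and route permanently failing messages to a dead-letter queue rather than retrying forever.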
Good to have:
- Experience dealing with PII and information security.
- Experience conducting internal Audits and assisting External Audits.
- Experience implementing solutions on-premise.
- Experience with blockchain.
- Experience with Private Cloud setup.
Required Experience:
- B.Tech. / B.E. degree in Computer Science or equivalent software engineering degree/experience.
- You need to have 2-4 years of DevOps & Automation experience.
- Need to have a deep understanding of AWS.
- Need to be an expert with Git or similar version control systems.
- Deep understanding of at least one open-source distributed system (Kafka, Redis, etc.)
- Ownership attitude is a must.
We offer a suite of memberships and subscriptions to spice up your lifestyle. We believe in an ultimate work-life balance: working hard doesn't mean clocking in extra hours, it means having the zeal to contribute the best of your talents. Our people culture helps us put in place measures and benefits that help you feel confident and happy every day. Whether you'd like to skill up, go off the grid, attend your favourite events or be an epitome of fitness, we have you covered.
- Health Memberships
- Sports Subscriptions
- Entertainment Subscriptions
- Key Conferences and Event Passes
- Learning Stipend
- Team Lunches and Parties
- Travel Reimbursements
- ESOPs
That's what we think will brighten up your personal life, as a gesture of thanks for sharing your talents with us.
Join us to be a part of our exciting journey to build one Digital Identity Platform!
Job Description
Niki is an artificially intelligent ordering application (http://niki.ai/app). Our founding team is from IIT Kharagpur, and we are looking for a Natural Language Processing Engineer to join our engineering team.
The ideal candidate will have industry experience solving language-related problems using statistical methods on vast quantities of data available from Indian mobile consumers and elsewhere.
Major responsibilities would be:
1. Create language models from text data. These language models draw heavily on recent statistical, deep learning and rule-based research around building taggers, parsers, knowledge-graph-based dictionaries, etc.
2. Develop highly scalable classifiers and tools leveraging machine learning, regression and rule-based models
3. Work closely with product teams to implement algorithms that power user and developer-facing products
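To make the "language models from text data" item above concrete, here is a toy bigram model built with the standard library. This is a teaching sketch of the statistical idea only, not a description of Niki's production models, and the tiny corpus is invented.

```python
from collections import Counter, defaultdict

# Toy bigram language model: estimate P(word | previous word) by counting.
# The corpus below is invented for illustration.
corpus = "book a cab . book a recharge . book a cab".split()

bigrams = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    bigrams[prev][word] += 1

def p(word, prev):
    """Maximum-likelihood estimate of P(word | prev)."""
    total = sum(bigrams[prev].values())
    return bigrams[prev][word] / total if total else 0.0

print(p("cab", "a"))  # "a" is followed by "cab" twice and "recharge" once
```

Real systems smooth these estimates and combine them with the deep learning and rule-based components the responsibilities mention, but the counting backbone is the same.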
We work mostly in Java and Python and object oriented concepts are a must to fit in the team. Basic eligibility criteria are:
1. Graduate/Post-Graduate/M.S./
2. Industry experience of min 5 years.
3. Strong background in Natural Language Processing and Machine Learning
4. Have some experience in leading a team big or small.
5. Experience with Hadoop/HBase/Pig or MapReduce/Sawzall/Bigtable is a plus
Competitive Compensation.
What We're Building
We are building an automated messaging platform to simplify the ordering experience for consumers. We have launched the Android app (http://niki.ai/app). In its current avatar, Niki can process mobile phone recharges and book cabs for consumers. It assists in finding the right recharge plans across top-up, 2G and 3G, and completes the transaction. In cab booking, it handles end-to-end booking along with tracking and cancellation within the app, and lets you compare available cabs to find the nearest or cheapest one.
Being an instant messaging app, it works seamlessly on 2G/3G/WiFi and is lightweight at around 3.6 MB. You can check it out at https://niki.ai/.













