About Product:
Instanodes:
Instanodes is a leading provider of Web3 infrastructure, offering robust and scalable solutions for blockchain developers and businesses. Our mission is to simplify the complexities of blockchain technology, enabling our clients to focus on building innovative and disruptive applications. We are passionate about empowering the decentralized future and are seeking a talented and driven Product Manager to join our team.
About the Role:
We are looking for an experienced and entrepreneurial Product Manager to lead the growth and development of Instanodes. You will be responsible for the overall product strategy, roadmap, and execution, with a keen focus on driving P&L, securing fundraises, and shaping the long-term vision of the product. You will work closely with engineering, marketing, sales, and leadership to ensure Instanodes remains at the forefront of the Web3 infrastructure landscape.
Responsibilities:
- Product Strategy & Roadmap: Define and champion the product vision, strategy, and roadmap for Instanodes, aligning with the company's overall goals and objectives.
- Market Analysis & Competitive Intelligence: Conduct in-depth market research, analyze competitor offerings (Zeeve, Alchemy, Quicknode, etc.), and identify opportunities for differentiation and growth.
- User Research & Customer Understanding: Deeply understand the needs and pain points of Web3 developers and businesses through user research, customer interviews, and data analysis.
- Product Development & Execution: Translate user needs into detailed product specifications and user stories, prioritize features, and manage the product backlog. Collaborate closely with engineering to ensure timely and high-quality product delivery.
- Go-to-Market Strategy & Launch Planning: Develop and execute go-to-market strategies for new product features and releases.
- Growth & Metrics: Define and track key performance indicators (KPIs) to measure product success. Analyze product usage data to identify areas for improvement and growth.
- P&L Management: Own the P&L for Instanodes, focusing on revenue generation, cost optimization, and profitability.
- Fundraising: Support fundraising efforts by developing compelling investment materials, presenting to investors, and contributing to due diligence processes.
- Team Leadership & Collaboration: Foster a collaborative and high-performing product team environment. Mentor and guide junior product team members.
Qualifications:
- 5+ years of experience in product management, with a focus on B2B SaaS or infrastructure products.
- Strong understanding of blockchain technology, Web3 concepts, and the decentralized ecosystem.
- Familiarity with the competitive landscape of Web3 infrastructure providers (Zeeve, Alchemy, Quicknode, etc.).
- Proven track record of successfully launching and growing products.
- Excellent analytical and problem-solving skills.
- Strong communication, interpersonal, and presentation skills.
- Experience with agile development methodologies.
- Experience with P&L management and fundraising is a plus.

About Antier Solutions Pvt. Ltd (Antech)
We are an experienced enterprise blockchain solutions provider specializing in enterprise blockchain development. Leverage our trusted blockchain solutions for the enterprise.
Job Title: Technical Content Writer
Experience: 1-2 years
Job Location: Madurai (On-site)
We are seeking an experienced candidate to support PhD scholars across all departments in high-quality academic paper writing.
Job Description:
- Strong skills in academic and technical writing, including experience in analysing the nature and requirements of each piece of work.
- High level of accuracy and attention to detail in writing and editing.
- Good written and verbal communication skills.
- Ability to manage multiple projects and meet deadlines.
- Strong commitment to active participation and a readiness to recognize and flag risks.
- Thorough knowledge of the education field and the ability to write research materials tailored to client needs.
- Achieve and maintain relevant product knowledge.
- You will assist in creating high-quality academic content that aligns with our research initiatives and promotes our mission.
Key Responsibilities:
- Conduct literature reviews and gather data from credible sources.
- Analyze research findings using statistical/analytical tools.
- Collaborate with researchers and faculty to refine methodologies.
- Write and edit academic/technical content (reports, papers, articles).
- Ensure accuracy, originality, and adherence to citation standards.
- Translate complex topics into clear, structured content.
- Maintain organized documentation and research databases.
- Deliver high-quality outputs within deadlines.
- Coordinate with editorial/research teams to meet project goals.
Roles & Responsibilities:
- Collaborate effectively as a team player, quick learner, and proactive self-starter.
- Set clear goals, monitor team performance, and ensure timely delivery of quality research outputs.
- Conduct in-depth research on technology, innovation, and related fields.
- Review, edit, and refine content for clarity, coherence, and academic standards.
- Demonstrate expert-level English writing with strong technical and innovative content development skills.
- Maintain up-to-date knowledge of research activities and emerging trends.
- Coordinate with project managers to allocate resources, prioritize tasks, and meet deadlines.
- Exhibit strong verbal and written communication, technical writing, and editing skills.
Qualifications & Skills:
- Any Bachelor’s or Master’s degree
- Strong analytical and critical thinking.
- Academic writing and documentation skills.
- Familiarity with research tools (SPSS, Python, R, Excel, etc.).
- Knowledge of referencing styles and academic standards.
- Attention to detail and ability to work with complex data.
Why Join Us?
- Opportunity to work on diverse academic and technical projects.
- Growth-oriented work environment.
- Competitive pay and rewards based on performance.
- Opportunities for professional development and advancement.
- A workplace that encourages teamwork and inclusion.
Key Skills & Requirements
- Strong proficiency in Java 11 and above
- Hands-on expertise in Spring Boot and Microservices Architecture
- Strong Programming, Analytical, and Problem-Solving skills
- Proficiency with NoSQL Databases (MongoDB, Cosmos DB) and RDBMS (SQL/Oracle/Postgres)
- Experience with Messaging Queues (RabbitMQ/Kafka)
- Good functional and domain knowledge
- Expertise in Project Architecture & Data Flows
- Proficient in CI/CD tools (Jenkins or equivalent)
- Strong experience with Testing Frameworks: JUnit, Mockito, Cucumber, BDD
- Basic knowledge of Cloud Platforms (Azure / AWS)
- Familiarity with Project Management Tools – JIRA, Confluence, ServiceNow
- Knowledge of monitoring tools such as New Relic, Splunk, or Nagios (good to have)
Responsibilities
- Design, develop, and maintain scalable backend applications using Java 11+ and Spring Boot
- Build and manage microservices-based solutions ensuring high performance and low latency
- Collaborate with cross-functional teams to define, design, and ship new features
- Implement best practices for CI/CD pipelines and automate deployment workflows
- Ensure code quality through unit testing, integration testing, and BDD practices
- Work with databases (SQL/NoSQL) for data modeling and optimization
- Monitor, troubleshoot, and enhance system performance using tools like Splunk, New Relic, Nagios
- Provide technical guidance and mentorship to team members
Position: AWS Data Engineer
Experience: 5 to 7 Years
Location: Bengaluru, Pune, Chennai, Mumbai, Gurugram
Work Mode: Hybrid (3 days work from office per week)
Employment Type: Full-time
About the Role:
We are seeking a highly skilled and motivated AWS Data Engineer with 5–7 years of experience in building and optimizing data pipelines, architectures, and data sets. The ideal candidate will have strong experience with AWS services including Glue, Athena, Redshift, Lambda, DMS, RDS, and CloudFormation. You will be responsible for managing the full data lifecycle from ingestion to transformation and storage, ensuring efficiency and performance.
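Purely as an illustration of the kind of pipeline work described above, the sketch below shows a minimal PySpark batch job that reads raw CSV from S3, applies a few cleaning steps, and writes partitioned Parquet to a curated location. The bucket names, paths, and column names are hypothetical, and a production AWS Glue job would typically add Glue's GlueContext/Job boilerplate, error handling, and CloudFormation-managed infrastructure.

```python
# Hypothetical PySpark ETL sketch: raw CSV in S3 -> cleaned, partitioned Parquet.
# Bucket names, paths, and columns are illustrative, not taken from this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: read raw CSV files dropped into the landing zone.
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3://example-landing-zone/orders/")
)

# Transform: basic cleaning plus a derived partition column.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_ts").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
)

# Load: write partitioned Parquet to the curated zone
# (queryable downstream via Athena or Redshift Spectrum).
(
    clean.write
         .mode("overwrite")
         .partitionBy("order_date")
         .parquet("s3://example-curated-zone/orders/")
)

spark.stop()
```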
Key Responsibilities:
- Design, develop, and optimize scalable ETL pipelines using AWS Glue, Python/PySpark, and SQL.
- Work extensively with AWS services such as Glue, Athena, Lambda, DMS, RDS, Redshift, CloudFormation, and other serverless technologies.
- Implement and manage data lake and warehouse solutions using AWS Redshift and S3.
- Optimize data models and storage for cost-efficiency and performance.
- Write advanced SQL queries to support complex data analysis and reporting requirements.
- Collaborate with stakeholders to understand data requirements and translate them into scalable solutions.
- Ensure high data quality and integrity across platforms and processes.
- Implement CI/CD pipelines and best practices for infrastructure as code using CloudFormation or similar tools.
Required Skills & Experience:
- Strong hands-on experience with Python or PySpark for data processing.
- Deep knowledge of AWS Glue, Athena, Lambda, Redshift, RDS, DMS, and CloudFormation.
- Proficiency in writing complex SQL queries and optimizing them for performance.
- Familiarity with serverless architectures and AWS best practices.
- Experience in designing and maintaining robust data architectures and data lakes.
- Ability to troubleshoot and resolve data pipeline issues efficiently.
- Strong communication and stakeholder management skills.
Overall Experience: 6+ years
Relevant Experience: 4+ years
Location: Hyderabad for the initial 10 days, remote thereafter
*ONLY IMMEDIATE JOINERS*
Role Description
This is a full-time remote role for a Data Engineer. The Data Engineer will be responsible for daily tasks such as data engineering, data modeling, extract transform load (ETL), data warehousing, and data analytics. Collaboration and communication with cross-functional teams will be required to ensure successful project outcomes.
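As a rough sketch of the ETL and warehousing work described above (illustrative only; the database, table, and column names are assumptions, not details from this role), the snippet below aggregates raw events with Spark SQL and writes the result to a partitioned, Hive-backed summary table.

```python
# Illustrative only: roll up raw events into a Hive-backed summary table with Spark SQL.
# Database, table, and column names are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("daily_event_rollup")
    .enableHiveSupport()          # assumes a Hive metastore is configured
    .getOrCreate()
)

# Source table assumed to exist in the metastore,
# e.g. raw_db.events(user_id, event_type, event_ts).
daily = spark.sql("""
    SELECT
        to_date(event_ts)       AS event_date,
        event_type,
        COUNT(*)                AS events,
        COUNT(DISTINCT user_id) AS unique_users
    FROM raw_db.events
    GROUP BY to_date(event_ts), event_type
""")

# Write the rollup as a partitioned warehouse table for downstream analytics.
(
    daily.write
         .mode("overwrite")
         .partitionBy("event_date")
         .saveAsTable("analytics_db.daily_event_rollup")
)

spark.stop()
```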
Qualifications
- SQL, Python/Scala, Spark, Hadoop, Hive, HDFS
- Data Engineering, Data Modeling, and Extract Transform Load (ETL) skills
- Data Warehousing and Data Analytics skills
- Experience in working with large datasets and data pipelines
- Proficiency in programming languages such as Python or SQL
- Knowledge of data integration and data processing tools
- Familiarity with cloud platforms and big data technologies
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
- Ability to work independently and remotely
- Bachelor's degree in Computer Science, Data Science, or a related field
About the role:
As a team member at TrusTrace, you’ll get to solve challenging, real-world problems that truly make a difference to society.
As a Product Developer at TrusTrace, you'll get to solve challenging, real-world problems using cutting-edge technologies. You get to work with industry thought leaders and big-name brands. You will work with the product team to translate requirements into user stories and priorities. You will get hands-on experience in polyglot programming to build solutions and write tests to ensure quality code. (We primarily work with Java, TypeScript/Node, and Golang.) If you build it, you will own it, i.e., you will generate metrics and track improvements and bug fixes for the features you build and ship.
Experience & Skills (3–6 years): The successful candidate will have
- Passion for problem-solving.
- Flexibility to multitask and re-prioritise when necessary.
- Ability to work in an agile and customer-centric team.
- Open to learning new technologies.
- Prior SaaS/start-up experience is preferable, but not mandatory.
- Hands-on experience in back-end technologies and a strong understanding of core data structures and design patterns.
- Commanding knowledge of HLD/LLD (high-level and low-level design).
- Basics of system design and distributed systems
- Strong fundamentals in frameworks like Spring MVC, Spring Security, Spring Data, and Spring-boot.
- Sound knowledge in Java and JVM ecosystem.
- Ability to write production-grade test code (JUnit/TestNG).
- Proficiency with REST API performance and OpenAPI standards.
- Experience in building cache layers and invalidating them properly ;-)
- Strong knowledge of NoSQL (MongoDB) databases.
- Experience with Elasticsearch or Solr is preferred.
- Basic understanding of cloud infrastructure (preferably AWS) and CI/CD pipelines.
- Should have startup/product Experience
Educational Qualification: Preferably from a CS or circuit-branch background
Role & responsibilities
The Telesales & Service Associate at NAVEEN COMPANIES sells IT services and products (website, digital marketing, e-commerce, mobile app) over the phone to clients who are looking for them. The job is to reach out to customers over the phone, explain the products and their features, and deliver value to the customers.
· Responsible for making outbound calls and regularly following up on assigned leads.
· Build strong relationships and trust with customers by understanding their requirements and suggesting the right product.
· Sell IT services and products over the phone, achieving the targeted sales numbers and value within the set quality parameters.
· Explain the services and product features in detail, including how to use the packages.
· Understand the customer's requirements.
· Address all customer issues and grievances and provide the right customer experience.
· Capture insights from customer interactions and share them with the internal team.
· Strictly adhere to the process requirements.
Preferred candidate profile
· Business Knowledge / Process Knowledge
· Product Knowledge
· Customer Orientation
· Language skills: English / Telugu
· Telemarketing / Selling Skills
· Basic computer skills: MS Excel, Word
Behavioral
· Values – Integrity / Honesty / Respect
· Communication – Oral & Written skills / Listening
· Executive Presence
· Business / Telephone Etiquette
· Influencing Skills
· Empathy
· Self-driven / Initiative
· Ability to manage stress
· Willingness to learn
Education and Experience
· Graduate in any stream
· 0-12 months experience in telesales / collections / retention in BFSI / BPO / Telecom / Insurance / Timeshare sectors
About Kloud9:
Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers help retailers launch a successful cloud initiative so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce the cloud adoption time and effort so you can directly benefit from lower migration costs.
Kloud9 was founded with the vision of bridging the gap between e-commerce and the cloud. Traditional e-commerce infrastructure is limiting for any industry and poses a huge challenge in terms of the money spent on physical data infrastructure.
At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.
Our sole focus is to provide cloud expertise to the retail industry, empowering our clients to take their business to the next level. Our team of proficient architects, engineers and developers has been designing, building and implementing solutions for retailers for an average of more than 20 years.
We are a cloud vendor that is both platform and technology independent. Our vendor independence not only gives us a unique perspective on the cloud market but also ensures that we deliver the cloud solutions that best meet our clients' requirements.
What we are looking for:
● 3+ years' experience developing Big Data & Analytics solutions
● Experience building data lake solutions leveraging Google Data Products (e.g. Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.), Hive, and Spark
● Experience with relational SQL and NoSQL databases
● Experience with Spark (Scala/Python/Java) and Kafka (see the brief sketch after this list)
● Work experience with Databricks (Data Engineering and Delta Lake components)
● Experience with source control tools such as GitHub and related dev processes
● Experience with workflow scheduling tools such as Airflow
● In-depth knowledge of any scalable cloud vendor (GCP preferred)
● Has a passion for data solutions
● Strong understanding of data structures and algorithms
● Strong understanding of solution and technical design
● Has a strong problem solving and analytical mindset
● Experience working with Agile Teams.
● Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
● Able to quickly pick up new programming languages, technologies, and frameworks
● Bachelor’s Degree in computer science
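The brief sketch referenced in the Spark/Kafka item above is given here: a minimal Spark Structured Streaming job that reads a Kafka topic and appends Parquet files to a data-lake path. The broker address, topic, and GCS paths are hypothetical, and a production pipeline would add schema handling, monitoring, and orchestration (e.g. via Airflow).

```python
# Hypothetical sketch: stream events from Kafka into a data-lake path with
# Spark Structured Streaming. Broker, topic, and paths are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka_to_lake").getOrCreate()

# Read the raw Kafka stream (requires the spark-sql-kafka package on the classpath).
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
)

# Kafka delivers key/value as binary; keep the value as a string plus the event timestamp.
parsed = events.select(
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_ts"),
)

# Append micro-batches as Parquet files, tracking progress in a checkpoint directory.
query = (
    parsed.writeStream
    .format("parquet")
    .option("path", "gs://example-lake/clickstream/")
    .option("checkpointLocation", "gs://example-lake/checkpoints/clickstream/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```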
Why Explore a Career at Kloud9:
With job opportunities in prime locations across the US, London, Poland and Bengaluru, we help you build your career path in cutting-edge technologies such as AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with its creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers!
* Minimum 2 years of relevant experience in .NET or .NET Core server-side programming as the most recent working experience
* Basic engineering knowledge: OOP, data structures, memory management, etc.
* Hands-on experience in .NET, WCF (optional) and Windows concepts
* Hands-on experience in SQL; the basics are a must
* Understanding of requirement analysis and design principles is required
* Good grasp of unit testing and debugging concepts
* Good communication skills
Mandatory skillset: SQL, REST API / gRPC
Job Location: Gurgaon (Hybrid)
Early joiners with a notice period of 15 days to 1 month will be preferred.
- Use data to develop machine learning models that optimize decision making in Credit Risk, Fraud, Marketing, and Operations
- Implement data pipelines, new features, and algorithms that are critical to our production models
- Create scalable strategies to deploy and execute your models
- Write well designed, testable, efficient code
- Identify valuable data sources and automate collection processes.
- Preprocess structured and unstructured data.
- Analyze large amounts of information to discover trends and patterns.
Requirements:
- 2+ years of experience in applied data science or engineering with a focus on machine learning
- Python expertise with good knowledge of machine learning libraries, tools, techniques, and frameworks (e.g. pandas, sklearn, xgboost, lightgbm, logistic regression, random forest classifiers, gradient boosting regressors, etc.); a brief illustrative sketch follows this list
- Strong quantitative and programming skills with a product-driven sensibility
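As noted above, here is a brief illustrative sketch: a scikit-learn pipeline that trains a gradient-boosting classifier on a hypothetical credit-risk dataset and reports ROC AUC. The CSV path, column names, and hyperparameters are assumptions for illustration, not details from the role.

```python
# Illustrative only: train and evaluate a simple credit-risk classifier.
# The CSV path, columns, and hyperparameters are hypothetical; features are assumed numeric.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Load a labelled dataset: numeric features plus a binary default flag.
df = pd.read_csv("loans.csv")
X = df.drop(columns=["defaulted"])
y = df["defaulted"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# A pipeline keeps preprocessing and the model together for cleaner deployment.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", GradientBoostingClassifier(n_estimators=200, learning_rate=0.05)),
])

model.fit(X_train, y_train)

# Evaluate with ROC AUC, a common metric for imbalanced risk problems.
scores = model.predict_proba(X_test)[:, 1]
print(f"Test ROC AUC: {roc_auc_score(y_test, scores):.3f}")
```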











