
CoinFantasy is looking for an experienced Senior AI Architect to lead both the decentralised protocol development and the design of AI-driven applications on this network. As a visionary in AI and distributed computing, you will play a central role in shaping the protocol’s technical direction, enabling efficient task distribution, and scaling AI use cases across a heterogeneous, decentralised infrastructure.
Job Responsibilities
- Architect and oversee the protocol’s development, focusing on dynamic node orchestration, layer-wise model sharding, and secure, P2P network communication.
- Drive the end-to-end creation of AI applications, ensuring they are optimised for decentralised deployment and include use cases with autonomous agent workflows.
- Architect AI systems capable of running on decentralised networks, ensuring they balance speed, scalability, and resource usage.
- Design data pipelines and governance strategies for securely handling large-scale, decentralised datasets.
- Implement and refine strategies for swarm intelligence-based task distribution and resource allocation across nodes.
- Identify and incorporate trends in decentralised AI, such as federated learning and swarm intelligence, relevant to various industry applications.
- Lead cross-functional teams in delivering full-precision computing and building a secure, robust decentralised network.
- Represent the organisation’s technical direction, serving as the face of the company at industry events and client meetings.
Requirements
- Bachelor’s/Master’s/Ph.D. in Computer Science, AI, or related field.
- 12+ years of experience in AI/ML, with a track record of building distributed systems and AI solutions at scale.
- Strong proficiency in Python, Golang, and machine learning frameworks (e.g., TensorFlow, PyTorch).
- Expertise in decentralised architecture, P2P networking, and heterogeneous computing environments.
- Excellent leadership skills, with experience in cross-functional team management and strategic decision-making.
- Strong communication skills, adept at presenting complex technical solutions to diverse audiences.
About Us
CoinFantasy is a Play to Invest platform that brings the world of investment to users through engaging games. With multiple categories of games, it aims to make investing fun, intuitive, and enjoyable for users. It features a sandbox environment in which users are exposed to the end-to-end investment journey without risking financial losses.
Building on this foundation, we are now developing a groundbreaking decentralised protocol that will transform the AI landscape.
Benefits
- Competitive Salary
- An opportunity to be part of the Core team in a fast-growing company
- A fulfilling, challenging and flexible work experience
- Practically unlimited professional and career growth opportunities

Data Engineer
Overview
We are seeking skilled Data Engineers to join our Data & Digital Twin Foundation team. You will design, build, and maintain data pipelines that power digital twin platforms, real-time operational systems, and AI/ML workloads. Working closely with data architects, simulation engineers, and ML teams, you will transform raw operational data into high-quality, governed datasets that drive intelligent decision-making.
Our core data platform stack includes:
Data Platform & Lakehouse
- Databricks as the single point of truth for all data
- Real-time data pipelines implemented with Kafka for data ingestion
- Databricks SQL for analytical queries
- Unity Catalog for metadata management and governance
- Teradata for data warehousing and business intelligence
Stream & Event Processing
- Apache Kafka for real-time event ingestion
- Structured Streaming for continuous data processing
- Delta Live Tables for declarative, quality-enforced pipelines
Data Quality
- Delta Live Tables expectations for data validation
- Data profiling and anomaly detection
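The Delta Live Tables expectations mentioned above attach named, row-level validation rules to a pipeline, dropping or flagging rows that fail. As a rough, framework-free illustration of that semantics (the function and rule names below are hypothetical, not the DLT API), an "expect or drop" rule partitions rows into kept records and per-rule failure counts:

```python
# Stdlib-only sketch of "expect or drop" semantics: each expectation is a
# named predicate; rows failing any rule are dropped, and failures are tallied.

def expect_or_drop(rows, expectations):
    """Keep rows passing every expectation; count failures per rule."""
    kept, metrics = [], {name: 0 for name in expectations}
    for row in rows:
        failed = [name for name, pred in expectations.items() if not pred(row)]
        for name in failed:
            metrics[name] += 1
        if not failed:
            kept.append(row)
    return kept, metrics

readings = [
    {"device_id": "a1", "value": 7.2},
    {"device_id": None, "value": 3.1},   # fails valid_id
    {"device_id": "b2", "value": -1.0},  # fails non_negative
]
rules = {
    "valid_id": lambda r: r["device_id"] is not None,
    "non_negative": lambda r: r["value"] >= 0,
}
kept, metrics = expect_or_drop(readings, rules)
# kept -> one valid row; metrics -> {"valid_id": 1, "non_negative": 1}
```

In a real DLT pipeline the same idea is expressed declaratively with decorators on a table definition, and the metrics surface in the pipeline event log.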
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using Databricks, PySpark, and Delta Lake
- Build high-performance real-time and batch ingestion pipelines from diverse operational systems using Kafka
- Implement data transformations that serve digital twin platforms and operational analytics
- Integrate Kafka event streams with Databricks for real-time operational state updates
- Implement data quality checks using Delta Live Tables expectations
- Ensure data governance compliance through Unity Catalog (lineage, access control, metadata)
- Optimize pipeline performance, reliability, and cost efficiency
- Write clean, well-documented, and testable code following engineering best practices
- Collaborate with ML engineers to deliver feature-engineered datasets
- Participate in code reviews, knowledge sharing, and continuous improvement initiatives
- Support production data systems through monitoring, troubleshooting, and incident resolution.
- Build business data warehouse solutions using Teradata for business intelligence
Preferred Qualifications
- 7+ years of hands-on data engineering experience
- Track record of building and maintaining production-grade data pipelines
- Experience with Delta Live Tables for declarative pipeline development
- Experience working in agile, cross-functional teams
- Familiarity with time-series data patterns and operational data modelling
Highly Desirable
- Experience building data pipelines for digital twin or simulation platforms
- Familiarity with operational state modeling for real-time systems
- Exposure to physics-informed or time-series ML feature engineering
- Experience working with distributed, multidisciplinary teams
- Exposure to industrial domains such as Manufacturing, Logistics, or Transportation is a plus
Location: Hyderabad, Telangana
Department: Engineering
Employment Type: Full-Time
Designation: Campus Admin Officer
Department: Admin
Location: Mahape, Navi Mumbai
Experience: 2–4 Years
Employment Type: Full-Time
Industry Preference: IT Services / Corporate Office / Facility Management
🏢 Job Summary
We are looking for a hands-on and dependable Campus Admin Officer to manage Electrical, IT, and Communication systems for our 120-employee corporate office in Mahape, Navi Mumbai. The candidate will be responsible for troubleshooting electrical faults, maintaining desktops and LAN/WiFi networks, and ensuring smooth day-to-day office operations with minimal downtime.
🔧 Key Responsibilities
Electrical Maintenance
- Diagnose and resolve electrical faults.
- Maintain office lighting, fans, and electrical appliances.
- Troubleshoot generator, UPS, and power supply cabling issues.
IT & Network Management
- Maintain desktops and laptops.
- Troubleshoot hardware, printer, and peripheral issues.
- Manage LAN, structured cabling, routers, switches, and WiFi networks.
- Coordinate with IT/electrical vendors for servicing and repairs.
- Maintain inventory of IT assets and spare equipment.
Intercom & Communication Systems
- Maintain and troubleshoot internal intercom systems.
- Coordinate AMC vendors for preventive maintenance.
✅ Required Skills
- 2–4 years of hands-on experience in Electrical & IT maintenance.
- Basic knowledge of modems, routers, switches, LAN, WiFi.
- Experience in office IT support / desktop support.
- Good communication skills (English/Hindi/Marathi).
- Ability to work independently.
🎯 Preferred Candidate Profile
- Diploma/ITI in Electrical / Electronics / Hardware & Networking (preferred).
- Experience handling office infrastructure for 50+ employees.
- Immediate joiners preferred.
- Candidates located in Navi Mumbai / Thane / Mumbai preferred.
- Develop and maintain visually appealing, user-friendly web interfaces using HTML, CSS, JavaScript, and modern front-end frameworks (e.g., React, Angular, Vue.js).
- Collaborate with designers and back-end developers to implement and enhance the user experience.
- Convert design mockups and wireframes into high-quality, responsive web pages.
- Optimize applications for maximum speed, performance, and scalability.
- Ensure cross-browser compatibility and responsiveness across various devices.
- Write clean, maintainable, and reusable code.
- Participate in code reviews to maintain code quality and improve team knowledge.
- Work with front-end build tools and version control systems (e.g., Webpack, Git).
- Continuously research and stay up to date with the latest trends, best practices, and technologies in front-end development.

We are seeking a Data Engineer (Snowflake, BigQuery, Redshift) to join our team. In this role, you will be responsible for developing and maintaining fault-tolerant pipelines spanning multiple database systems.
Responsibilities:
- Collaborate with engineering teams to create REST API-based pipelines for large-scale MarTech systems, optimizing for performance and reliability.
- Develop comprehensive data quality testing procedures to ensure the integrity and accuracy of data across all pipelines.
- Build scalable dbt models and configuration files, leveraging best practices for efficient data transformation and analysis.
- Partner with lead data engineers in designing scalable data models.
- Conduct thorough debugging and root cause analysis for complex data pipeline issues, implementing effective solutions and optimizations.
- Follow and adhere to group's standards such as SLAs, code styles, and deployment processes.
- Anticipate breaking changes and implement backwards-compatibility strategies for API schema changes.
- Assist the team in monitoring pipeline health via observability tools and metrics.
- Participate in refactoring efforts as platform application needs evolve over time.
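The backwards-compatibility responsibility above can be sketched as a payload normaliser that accepts both the old and new shapes of an upstream API (the field names and versions here are hypothetical, not from any specific MarTech system):

```python
# Hypothetical sketch: normalise old- and new-style API payloads into a
# single internal record shape so downstream pipeline code survives a
# schema change in the upstream API.

def normalise_event(payload):
    """Accept v1 ({"user": ..., "ts": ...}) and
    v2 ({"user_id": ..., "timestamp": ...}) payloads alike."""
    user = payload.get("user_id", payload.get("user"))
    ts = payload.get("timestamp", payload.get("ts"))
    if user is None or ts is None:
        raise ValueError(f"unrecognised payload shape: {sorted(payload)}")
    return {"user_id": user, "timestamp": ts}

old = normalise_event({"user": "u1", "ts": "2024-01-01T00:00:00Z"})
new = normalise_event({"user_id": "u1", "timestamp": "2024-01-01T00:00:00Z"})
# both normalise to the same internal record
```

Keeping this translation at the ingestion boundary means dbt models and downstream consumers only ever see one schema.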
Requirements:
- Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or a related field.
- 3+ years of professional experience with a cloud database such as Snowflake, BigQuery, or Redshift.
- 1+ years of professional experience with dbt (Cloud or Core).
- Exposure to various data processing technologies such as OLAP and OLTP and their applications in real-world scenarios.
- Exposure to work cross-functionally with other teams such as Product, Customer Success, Platform Engineering.
- Familiarity with orchestration tools such as Dagster/Airflow.
- Familiarity with ETL/ELT tools such as dltHub, Meltano, Airbyte, or Fivetran, and with dbt.
- Upper-intermediate to advanced SQL skills (comfortable with CTEs and window functions).
- Proficiency with Python and related libraries (e.g., pandas, sqlalchemy, psycopg2) for data manipulation, analysis, and automation.
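As a concrete reading of the SQL bar above (CTEs plus window functions), here is a minimal sketch using Python's built-in sqlite3 module; the table and column names are purely illustrative, and SQLite 3.25+ is assumed for window-function support:

```python
import sqlite3

# In-memory database with a toy orders table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES ('acme', 10), ('acme', 20), ('bolt', 5);
""")

# A CTE aggregates per customer; a window function then ranks the totals.
rows = con.execute("""
    WITH totals AS (
        SELECT customer, SUM(amount) AS total
        FROM orders
        GROUP BY customer
    )
    SELECT customer, total, RANK() OVER (ORDER BY total DESC) AS rnk
    FROM totals
    ORDER BY rnk
""").fetchall()
# rows -> [('acme', 30.0, 1), ('bolt', 5.0, 2)]
```

The same CTE-then-window pattern carries over directly to Snowflake, BigQuery, or Redshift dialects.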
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/e9578?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com
Engineer / Sr. Engineer - Java/JEE Technologies
Pune

Position: Java/JEE Engineer / Sr. Engineer
Experience: 4-12 years
Location: Pune
Notice period: 15-30 days / immediate joiner; work from office

Experience - 4+ years (4-12 years) of experience in Java/JEE technologies.
Technology/skills - Java 1.9, Spring Boot, web services, Oracle, HTTP protocol, web proxy, JUnit, XML, JSON.
Qualification - B.E., B.Tech, B.Sc, BCA, BCS, or equivalent post-graduation in any branch, preferably E&TC or Computers.
Other tools used - Eclipse, Jira, Confluence, Maven, Jenkins, Git/Bitbucket (source version control).
Soft skills - Self-motivated, focused, a good team player with a positive attitude; good written and verbal communication skills with product management, internal IT, and stakeholders.
Nice to have - Knowledge of cryptology, HSM and key management / PKI / X.509, XML Sig / XML Enc, smart card operating systems, telecommunications (GSM), and electronic payment systems; proven ability to work in processes according to CMMI (Level 3).
Details - Good-to-have attributes:
- Knowledge of software architecture; understanding and preparing sequence diagrams, architecture, and process documentation.
- Experience in enterprise applications in the payment/banking domain preferred.
- Analysis, design, and implementation of smart card applications for the payments sector (Mastercard, Visa).
- Telecom/embedded systems or smart card background.
- Identify areas for improvement to achieve operational efficiency, customer satisfaction, and value addition.
- Preferably no travel constraints.
- High-/low-level design.
- Editorialist YX is looking for a Technical Architect - Search. As part of this role, you will work with a team that builds a unified search platform to power search for our .com website, iOS app, and internal support tools. This search reaches thousands of customers a day and will become pivotal to our tech efforts as we continue to grow 30x YoY.
- You will own the technical direction for the team, and you will be leading key search projects from ideation all the way to deployment. You will be working closely with both technical and business leaders to fulfill your mission.
- Salary is no bar for the right candidate.
QUALIFICATION
- 6+ years of experience working in java and web services.
- 6+ years of experience working in the Search domain.
- Proven skills in designing scalable, highly available distributed systems which can handle high data volumes.
- Strong understanding of software engineering principles and fundamentals including data structures and algorithms.
- Solid understanding of concurrency and multi-threading, multiple design patterns, and debugging and analytical methodologies.
- Hands-on experience with SolrCloud or Elasticsearch.
- Deep understanding of information retrieval concepts.
- Deep understanding of linguistic processing components such as tokenizers, spell checkers, and stemmers.
- Hands-on experience on big data tech stacks, like Hadoop, Hive, Cassandra, and Spark is a plus.
- Self-directed, self-motivated, and detail-oriented with the ability to come up with good design proposals and thorough analysis of production issues.
- Excellent written and oral communication skills on both technical and non-technical topics.
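The linguistic-processing items in the list above (tokenizers, stemmers) can be illustrated with a toy sketch; a production system would configure an analysis chain inside Solr or Elasticsearch rather than hand-rolling code like this, and a real stemmer would be Porter or Snowball rather than this naive suffix stripper:

```python
import re

def tokenize(text):
    """Lowercase and split on runs of non-alphanumeric characters."""
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

def stem(token):
    """Naive suffix-stripping stemmer, for illustration only."""
    for suffix in ("ing", "ed", "ers", "er", "es", "s"):
        # Only strip when a reasonably long stem remains.
        if token.endswith(suffix) and len(token) - len(suffix) >= 3:
            return token[: -len(suffix)]
    return token

terms = [stem(t) for t in tokenize("Searchers searching indexed documents")]
# terms -> ['search', 'search', 'index', 'document']
```

Collapsing "searchers" and "searching" onto the same stem is exactly what lets an index match a query against morphological variants of a word.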
RESPONSIBILITIES
- Design and build a search engine using Elasticsearch, working with engineers in the team toward the overall success of Search and other ML-based systems.
- Collaborate with peers from other Engineering groups to tackle complex and meaningful problems with efficient and scalable delivery of Search solutions.
- You are expected to be a self-motivated, dedicated, and solution-oriented individual. The main responsibilities for this position include:
- Leading efforts to build large-scale, distributed, and highly available systems and pipelines.
- Leading efforts to build large-scale, highly available information retrieval systems.
- Design and develop solutions using Java tech stack.
- Design and implement solutions in line with secure coding guidelines.
- Work with QA to identify issues and fix them.
EDUCATION & EXPERIENCE
- B.Tech. in Computer Science or equivalent experience
- 6+ Yrs of experience in Java, Web services & Search Domain
- Experience working in a product-based company
BENEFITS
- Retiral Benefits
- Medical Insurance
- Remote Working Opportunity for the time being
- MacBook
- Stock Options
- Gym Membership
We are looking for a Java professional for our international client’s project.
Please find the job description below.
Experience: 4 to 8 Years
Skills: Java with Spring, Hibernate, and AngularJS
Notice: 15 Days to 30 Days Max.
Role and responsibilities:
- Java 1.7 and higher. Significant and demonstrable experience implementing Java best practices, valuing scalability, availability, and performance.
- Spring MVC / Spring Boot
- REST and/or SOAP web services - public and private-facing APIs
- Good experience with Java Unit Testing Frameworks and Tools such as JUnit, TestNG, Mockito etc.
- JavaScript frameworks (Angular)
- Excellent problem solving, analytical, communication, organization and interpersonal skills
- Analytical thinking - able to simplify complex problems, processes or projects into component parts explore and evaluate them systematically
- Independent thinker with creative, resourceful and proactive problem-solving skills
- Ability to work both independently and as a team player
About our company
NeoQuant Solutions is a Software Application Development and Software Services company that uses cutting-edge technologies in developing software. We provide software solutions and custom products in the domain of the Bank, Finance, and Insurance sector and are known for the use of the latest technology. We believe in innovations and simplicity to solve the business problems of our clients and partners.
Website: https://neoquant.com/
Aikon Labs Pvt Ltd is a start-up focused on Realizing Ideas. One such idea is iEngage.io, our Intelligent Engagement Platform. We leverage Augmented Intelligence, a combination of machine-driven insights & human understanding, to serve a timely response to every interaction from the people you care about.
Do you have a passion to be part of an innovative startup? Here’s an opportunity for you - become an active member of our core platform development team. Get in touch if you are interested.
Main Duties
● Quickly research the latest innovations in Machine Learning, especially with respect to Natural Language Understanding, & implement them if useful
● Train models to provide different insights, mainly from text but also other media such as Audio and Video
● Validate the models trained. Fine-tune & optimise as necessary
● Deploy validated models, wrapped in a Flask server as a REST API, or containerize them in Docker containers
● Build preprocessing pipelines for the models that are being served as a REST API
● Periodically, test & validate models in use. Update where necessary
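A preprocessing pipeline of the kind listed above can be sketched as a chain of small, individually testable steps; the specific steps and names here are hypothetical, and in production this chain would sit in front of the model behind the Flask endpoint:

```python
# Hypothetical text-preprocessing chain applied before input reaches a model.
import re
import unicodedata

def clean(text):
    """Drop URLs and lowercase."""
    return re.sub(r"https?://\S+", "", text).lower()

def normalise(text):
    """Strip accents and collapse whitespace."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    return re.sub(r"\s+", " ", text).strip()

def pipeline(text, steps=(clean, normalise)):
    """Run each step in order; keeping steps separate makes them testable."""
    for step in steps:
        text = step(text)
    return text

out = pipeline("  Café reviews:   https://example.com  GREAT  ")
# out -> 'cafe reviews: great'
```

Because each step is a plain function, the same chain can be unit-tested offline and reused verbatim inside the Flask request handler.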
Role & Relationships
We consider ourselves a team & you will be a valuable part of it. You could be reporting to a senior member or directly to our Founder & CEO.
Educational Qualifications
We don’t discriminate, as long as you have the required skill set & the right attitude.
Experience
Up to two years of experience, preferably working on ML. Freshers are welcome too!
Skills
Good
● Strong understanding of Java / Python
● Clarity on concepts of Data Science
● A strong grounding in core Machine Learning
● Ability to wrangle & manipulate data into a processable form
● Knowledge of web technologies like Web server (Flask, Django etc), REST API's
Even better
● Experience with deep learning
● Experience with frameworks like Scikit-Learn, Tensorflow, Pytorch, Keras
Competencies
● Knowledge of NLP libraries such as NLTK, spaCy, and Gensim
● Knowledge of NLP models such as Word2vec, GloVe, ELMo, and fastText
● An aptitude to solve problems & learn something new
● Highly self-motivated
● Analytical frame of mind
● Ability to work in a fast-paced, dynamic environment
Location
Pune
Remuneration
Once we meet, we shall make an offer depending on how good a fit you are & the experience you already have

Below is the JD
Key Responsibilities:
• Understanding of architecture and design across all systems
• Working proficiency in developmental toolsets
• Knowledge of industry wide technology trends and best practices
• Ability to work in large, collaborative teams to achieve organizational goals, and passionate about building an innovative culture
• Proficiency in one or more modern programming languages
• Understanding of software skills such as business analysis, development, maintenance and software improvement
• Understanding of full SDLC
• Work experience in an AGILE development environment
• To research and implement new technologies
• To work under project manager/lead and deploy the projects successfully
• Ability to work to deadlines & as a team member
• Should be able to understand and handle tasks independently
• Ability to work with multiple teams on multiple projects simultaneously
• Troubleshooting through resolution of technical issues and system errors
• Performing Unit testing and documentation
• Delivering project work to defined timescales and deadlines
• Proactively reporting on project progress and issues
• Become a mentor and also be a learner amongst a high calibre team
Skills:
• Minimum of 6+ years of full-stack software engineering experience
• Core Java / Design Patterns: Java 8+, multi-threading, exception handling, defensive programming, microservices, Java design patterns
• Software Frameworks: Spring Core Framework, Spring Boot, Spring Integration, Apache Camel, Log4j 2.x / Logback
• Very good understanding of RDBMS & NoSQL databases like Oracle, MySQL, Postgres, and MongoDB/Cassandra
• Strong hands-on experience in Apache Camel
• Testing Frameworks: JUnit, PowerMock/Mockito, Cucumber
• Good knowledge of servers like Apache Tomcat, JBoss, and WildFly
• Messaging Protocols: Kafka, RabbitMQ, ActiveMQ, RESTful HTTP
• Databases: DB2, Oracle, MySQL
• Distributed Cache: Hazelcast, Redis, Memcached, GemFire
• AWS/private cloud and Docker/Kubernetes experience will be a plus
Role Summary/Purpose:
We are looking for Developers/Senior Developers to be part of building an advanced analytics platform leveraging Big Data technologies and transforming legacy systems. This is an exciting, fast-paced, constantly changing and challenging work environment, and you will play an important role in resolving and influencing high-level decisions.
Requirements:
- The candidate must be a self-starter who can work under general guidelines in a fast-paced environment.
- Overall minimum of 4 to 8 years of software development experience, including 2 years of Data Warehousing domain knowledge
- Must have 3 years of hands-on working knowledge of Big Data technologies such as Hadoop, Hive, HBase, Spark, Kafka, Spark Streaming, Scala, etc.
- Excellent knowledge in SQL & Linux Shell scripting
- Bachelor’s/Master’s/Engineering degree from a well-reputed university.
- Strong communication, interpersonal, learning, and organizing skills, matched with the ability to manage stress, time, and people effectively
- Proven experience in co-ordination of many dependencies and multiple demanding stakeholders in a complex, large-scale deployment environment
- Ability to manage a diverse and challenging stakeholder community
- Diverse knowledge and experience of working on Agile Deliveries and Scrum teams.
Responsibilities
- Should work as a senior developer/individual contributor depending on the situation
- Should be part of SCRUM discussions and gather requirements
- Adhere to SCRUM timeline and deliver accordingly
- Participate in a team environment for the design, development and implementation
- Should take L3 activities on need basis
- Prepare Unit/SIT/UAT test cases and log the results
- Coordinate SIT and UAT testing; gather feedback and provide necessary remediation/recommendations in time
- Quality delivery and automation should be a top priority
- Coordinate changes and deployments in time
- Should create healthy harmony within the team
- Owns interaction points with members of the core team (e.g. BA, testing, and business teams) and any other relevant stakeholders









