
The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.
Work you’ll do
1. Design and develop scalable and efficient knowledge graph architectures.
2. Implement knowledge graph integration with existing data systems and business processes.
3. Lead the ontology design, data modeling, and schema development for knowledge representation.
4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.
5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.
6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.
7. Develop and maintain documentation and specifications for system architectures and designs.
8. Stay updated with the latest industry trends in knowledge graph technologies and data management.
The Team
Innovation & Technology anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across a prioritized offering portfolio and industry interactions.
It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.
I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.
Qualifications and Experience
Required:
1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
2. 6-10 years of professional experience in data engineering, with proven experience in designing and implementing knowledge graph systems.
3. Strong understanding of semantic web technologies (RDF, SPARQL, OWL, GraphQL, etc.).
4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.
5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).
6. Excellent analytical and problem-solving abilities.
7. Strong communication and collaboration skills to work effectively across teams.
Preferred:
1. Experience with machine learning and natural language processing.
2. Experience with Industry 4.0 technologies and principles.
3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.
4. Experience with containerization technologies like Docker and Kubernetes.
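As a toy illustration of the triple-based data modeling this role centers on (hypothetical data; a real system would use an RDF library such as rdflib or a graph database like Neo4j), the sketch below stores subject-predicate-object triples in plain Python and answers a SPARQL-like pattern query:

```python
# Minimal illustration of triple storage and pattern matching, the core idea
# behind RDF stores and SPARQL. All entities here are hypothetical examples.

triples = [
    ("alice", "works_for", "acme"),
    ("bob", "works_for", "acme"),
    ("acme", "located_in", "pune"),
    ("alice", "knows", "bob"),
]

def match(pattern, store):
    """Return variable bindings (names starting with '?') for each matching triple."""
    results = []
    for triple in store:
        binding, ok = {}, True
        for var, val in zip(pattern, triple):
            if var.startswith("?"):
                binding[var] = val
            elif var != val:
                ok = False
                break
        if ok:
            results.append(binding)
    return results

# Analogous to: SELECT ?who WHERE { ?who works_for acme }
employees = match(("?who", "works_for", "acme"), triples)
print([b["?who"] for b in employees])  # ['alice', 'bob']
```

Real knowledge-graph work adds ontologies, inference, and persistence on top, but the pattern-matching core is the same idea.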

Similar jobs
- Overall Synopsis (skills to focus on)
- Brush up your skills in Photoshop & Illustrator
- Illustrator tools - Pen tool, Gradient, Pathfinder, Blending modes, type on a path, rotate and reflect, mesh tool, selection tool, etc.
- Photoshop tools – change the color of the background, remove a selection, pen tool etc.
- Make sure to practice recreating curves and nodes smoothly when doing vectorization.
- Try making some logo designs, and work on the curves & texture.
This is a FULL-TIME, WFH, NIGHT-SHIFT role.
If interested, kindly share your updated CV at 82008 31681.
About the Role:
We are looking for a talented UI/UX Designer to join our team and play a key role in shaping the user experience of Hector, our AdTech platform. The ideal candidate will have experience in designing intuitive web applications and working with large data sets, ensuring a seamless and visually appealing user interface.
Responsibilities:
- Design user-centric web applications that enhance usability and engagement.
- Work closely with our CEO, developers, and other stakeholders to understand user needs and translate them into intuitive interfaces.
- Create wireframes, prototypes, and high-fidelity designs for new features and improvements.
- Optimize data-heavy interfaces to ensure clarity, usability, and performance.
- Conduct user research and usability testing, and iterate on feedback.
- Maintain design consistency and ensure adherence to design guidelines and best practices.
Requirements:
- 2-3 years of experience in UI/UX design, preferably in SaaS or data-intensive applications.
- Strong portfolio showcasing web application designs and ability to handle large data sets.
- Proficiency in design tools such as Figma.
- Understanding of responsive design, usability principles, and accessibility standards.
- Experience working with developers and familiarity with front-end technologies (HTML, CSS, JavaScript) is a plus.
- Knowledge of AdTech or advertising platforms is a bonus.
If you're passionate about designing data-driven applications and want to work on an impactful AdTech product, we'd love to hear from you!


XressBees – a logistics company started in 2015 – is amongst the fastest growing companies in its sector. Our vision to evolve into a strong full-service logistics organization is reflected in our various lines of business, such as B2C logistics (3PL), B2B Xpress, hyperlocal, and cross-border logistics.
Our strong domain expertise and constant focus on innovation have helped us rapidly evolve into India's most trusted logistics partner. XB has progressively carved its way towards best-in-class technology platforms, an extensive logistics network reach, and a seamless last-mile management system.
While on this aggressive growth path, we seek to become the one-stop-shop for end-to-end logistics solutions. Our
big focus areas for the very near future include strengthening our presence as service providers of choice and
leveraging the power of technology to drive supply chain efficiencies.
Job Overview
XpressBees is enriching and scaling its end-to-end logistics solutions at a high pace. This is a great opportunity to join the team forming and delivering the operational strategy behind Artificial Intelligence / Machine Learning and Data Engineering, leading projects and teams of AI Engineers in collaboration with Data Scientists. In this role, you will build high-performance AI/ML solutions using groundbreaking AI/ML and big data technologies. You will need to understand business requirements and convert them into solvable data science problem statements. You will be involved in end-to-end AI/ML projects, from smaller-scale POCs all the way to full-scale ML pipelines in production.
Seasoned AI/ML Engineers will own the implementation and productionization of cutting-edge AI-driven algorithmic components for search, recommendation, and insights, to improve the efficiency of the logistics supply chain and serve the customer better.
You will apply innovative ML tools and concepts to deliver value to our teams and customers and make an impact on the organization while solving challenging problems in the areas of AI, ML, data analytics, and computer science.
Opportunities for application:
- Route Optimization
- Address / Geo-Coding Engine
- Anomaly detection, Computer Vision (e.g. loading / unloading)
- Fraud Detection (fake delivery attempts)
- Promise Recommendation Engine etc.
- Customer & Tech support solutions, e.g. chat bots.
- Breach detection / prediction
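For flavour, the simplest form of the anomaly detection mentioned above can be sketched with a z-score rule (toy delivery-time data and thresholds are illustrative only; a production system would use richer models such as isolation forests):

```python
# Toy z-score anomaly detector for delivery times, a minimal stand-in for the
# anomaly-detection use case listed above. Data and threshold are hypothetical.
import statistics

def flag_anomalies(values, z_threshold=2.5):
    """Return indices of values more than z_threshold std-devs from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]

delivery_minutes = [32, 35, 31, 33, 34, 30, 36, 33, 180, 32]
print(flag_anomalies(delivery_minutes))  # flags index 8, the 180-minute outlier
```

A robust variant would use the median and MAD instead of mean and standard deviation, since outliers inflate the standard deviation itself.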
An Artificial Intelligence Engineer will apply themselves in areas such as:
- Deep Learning, NLP, Reinforcement Learning
- Machine Learning - Logistic Regression, Decision Trees, Random Forests, XGBoost, etc.
- Driving Optimization via LPs, MILPs, Stochastic Programs, and MDPs
- Operations Research, Supply Chain Optimization, and Data Analytics/Visualization
- Computer Vision and OCR technologies
The AI Engineering team enables internal teams to add AI capabilities to their apps and workflows easily via APIs, without needing to build AI expertise in each team – Decision Support, NLP, and Computer Vision, for public clouds and enterprise in NLU, Vision, and Conversational AI.
The ideal candidate is adept at working with large data sets to find opportunities for product and process optimization, and at using models to test the effectiveness of different courses of action. They must have experience using a variety of data mining/data analysis methods and data tools, building and implementing models, using/creating algorithms, and creating/running simulations. They must be comfortable working with a wide range of stakeholders and functional teams. The right candidate will have a passion for discovering solutions hidden in large data sets and for working with stakeholders to improve business outcomes.
Roles & Responsibilities
● Develop scalable infrastructure, including microservices and backend, that automates training and
deployment of ML models.
● Build cloud services in Decision Support (anomaly detection, time series forecasting, fraud detection, risk prevention, predictive analytics), computer vision, natural language processing (NLP), and speech that work out of the box.
● Brainstorm and design various POCs using ML/DL/NLP solutions for new or existing enterprise problems.
● Work with fellow data scientists/SW engineers to build out other parts of the infrastructure, effectively communicating your needs, understanding theirs, and addressing external and internal stakeholders' product challenges.
● Build the core of Artificial Intelligence and AI services such as Decision Support, Vision, Speech, Text, NLP, NLU, and others.
● Leverage cloud technology – AWS, GCP, Azure.
● Experiment with ML models in Python using machine learning libraries (PyTorch, TensorFlow) and big data technologies (Hadoop, HBase, Spark, etc.).
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to
drive business solutions.
● Mine and analyze data from company databases to drive optimization and improvement of product
development, marketing techniques and business strategies.
● Assess the effectiveness and accuracy of new data sources and data gathering techniques.
● Develop custom data models and algorithms to apply to data sets.
● Use predictive modeling to increase and optimize customer experiences, supply chain metrics, and other business outcomes.
● Develop company A/B testing framework and test model quality.
● Coordinate with different functional teams to implement models and monitor outcomes.
● Develop processes and tools to monitor and analyze model performance and data accuracy.
● Deliver machine learning and data science projects using data science techniques and associated libraries, such as AI/ML or equivalent NLP (Natural Language Processing) packages. This requires a good to phenomenal understanding of statistical models, probabilistic algorithms, classification, clustering, deep learning, or related approaches as they apply to financial applications.
● The role will encourage you to learn a wide array of capabilities, toolsets and architectural patterns for
successful delivery.
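One of the responsibilities above is developing an A/B testing framework. Its statistical core can be sketched as a two-proportion z-test (the conversion numbers below are hypothetical; a real framework would also handle sample sizing and sequential testing):

```python
# Minimal two-proportion z-test, the statistical core of a simple A/B test.
# Numbers are illustrative; |z| > 1.96 corresponds to ~95% confidence for a
# two-sided test.
import math

def ab_z_statistic(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Variant A: 200/5000 conversions; variant B: 260/5000 conversions.
z = ab_z_statistic(conv_a=200, n_a=5000, conv_b=260, n_b=5000)
print(round(z, 2), abs(z) > 1.96)  # 2.86 True → B's lift is significant
```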
What is required of you?
You will get an opportunity to build and operate a suite of massive scale, integrated data/ML platforms in a broadly
distributed, multi-tenant cloud environment.
● B.S., M.S., or Ph.D. in Computer Science or Computer Engineering.
● Coding knowledge and experience with several languages: C, C++, Java, JavaScript, etc.
● Experience with building high-performance, resilient, scalable, and well-engineered systems
● Experience in CI/CD and development best practices, instrumentation, logging systems
● Experience using statistical computing languages (R, Python, SQL, etc.) to manipulate data and draw insights from large data sets.
● Experience working with and creating data architectures.
● Good understanding of various machine learning and natural language processing technologies, such as
classification, information retrieval, clustering, knowledge graph, semi-supervised learning and ranking.
● Knowledge and experience in statistical and data mining techniques: GLM/Regression, Random Forest,
Boosting, Trees, text mining, social network analysis, etc.
● Knowledge of web services: Redshift, S3, Spark, DigitalOcean, etc.
● Knowledge of creating and using advanced machine learning algorithms and statistics: regression, simulation, scenario analysis, modeling, clustering, decision trees, neural networks, etc.
● Knowledge of analyzing data from third-party providers: Google Analytics, Site Catalyst, Coremetrics, AdWords, Crimson Hexagon, Facebook Insights, etc.
● Knowledge of distributed data/computing tools: Map/Reduce, Hadoop, Hive, Spark, MySQL, Kafka, etc.
● Knowledge of visualizing/presenting data for stakeholders using QuickSight, Periscope, Business Objects, D3, ggplot, Tableau, etc.
● Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural
networks, etc.) and their real-world advantages/drawbacks.
● Knowledge of advanced statistical techniques and concepts (regression, properties of distributions,
statistical tests, and proper usage, etc.) and experience with applications.
● Experience building data pipelines that prep data for Machine learning and complete feedback loops.
● Knowledge of Machine Learning lifecycle and experience working with data scientists
● Experience with Relational databases and NoSQL databases
● Experience with workflow scheduling / orchestration such as Airflow or Oozie
● Working knowledge of current techniques and approaches in machine learning and statistical or
mathematical models
● Strong Data Engineering & ETL skills to build scalable data pipelines. Exposure to data streaming stack (e.g.
Kafka)
● Relevant experience in fine-tuning and optimizing ML (especially deep learning) models to bring down serving latency.
● Exposure to the ML model productionization stack (e.g. MLflow, Docker).
● Excellent exploratory data analysis skills to slice & dice data at scale using SQL in Redshift/BigQuery.


- Create elegant, efficient and maintainable software to support and extend our current products.
- Solve complex architectural challenges when implementing new features.
- Integrate with databases, file systems, cloud services when delivering scalable solutions.
- Troubleshoot and fix reported customer issues, delivering software patches as needed.
- Assist in making design decisions based on performance, scalability, security, and future expansion.
- Communicate and collaborate with management, product, and QA teams.
Job Location : Gurgaon (Sector 15)
What will help you thrive in this role?
- Degree in Computer Science, similar technical field of study or equivalent practical experience.
- Proficiency in web development using any of these languages: PHP, Node.js, or Golang.
- Should possess a solid grasp of object-oriented programming, system architecture & design, databases, development, and testing methodologies.
- Good in Design (HLD and LLD).
- Good in Database schema design.
- Good to have experience with AWS or Google Cloud or Azure and serverless architecture.
- Excellent verbal communication skills.
- Track record of delivering successful products as an engineer.
- Experience with large applications of some scale will be a plus.
- Ability to deep-dive into, understand, and improve legacy code.
- Domain knowledge of supply chain & retail industry is a plus.


- Good Experience in C# and SQL Server.
- Strong knowledge of the .NET web framework, including ASP.NET, ASP.NET MVC, .NET Core, Web API, ASPX, LINQ, WCF, HTML, JavaScript, jQuery, AJAX, and CSS.
- Experience with user interface design and prototyping.
- Experience in designing, developing, and deploying Web Service apps or Azure-hosted equivalents.
- Understanding of object-oriented and service-oriented application development techniques and theories.
- Experience with debugging, performance profiling and optimization.
- Experience with source control management systems and deployment environment.
- Familiarity with writing unit tests and microservices.
- Ability to take a project from start to finish with or without supervision.
- Experience with Azure technologies is desired.
- Strong analytical and communication skills with both internal team members and external business stakeholders.



We are Still Hiring!!!
Dear Candidate,
This email is regarding open positions for Data Engineer professionals with our organisation, CRMNext.
If you find the company profile and JD matching your aspirations, and your profile matches the required skills and qualification criteria, please share your updated resume along with responses to the questions.
We will reach out to you to schedule interviews after that.
About Company:
Driven by a Passion for Excellence
Acidaes Solutions Pvt. Ltd. is a fast-growing specialist Customer Relationship Management (CRM) product IT company providing ultra-scalable CRM solutions. It offers CRMNEXT, our flagship and award-winning CRM platform, to leading enterprises on both cloud and on-premise models. We consistently focus on using state-of-the-art technology solutions to provide leading product capabilities to our customers.
CRMNEXT is a global cloud CRM solution provider credited with the world's largest installation ever. From Fortune 500 to start-ups, businesses across nine verticals have built profitable customer relationships via CRMNEXT. A pioneer of Digital CRM for some of the largest enterprises across Asia-Pacific, CRMNEXT's customers include global brands like Pfizer, HDFC Bank, ICICI Bank, Axis Bank, Tata AIA, Reliance, National Bank of Oman, Pavers England etc. It was recently lauded in the Gartner Magic Quadrant 2015 for Lead management, Sales Force Automation and Customer Engagement. For more information, visit us at www.crmnext.com
Educational Qualification:
B.E./B.Tech/M.E./M.Tech/MCA (B.Sc. IT/B.Sc. Comp/BCA is mandatory)
60% in Xth, XIIth/diploma, and B.E./B.Tech/M.E./M.Tech/MCA (B.Sc. IT/B.Sc. Comp/BCA is mandatory)
All education should be regular (please note: degrees through distance learning/correspondence will not be considered).
Experience level: 2 to 5 years
Location: Andheri (Mumbai)
Technical expertise required:
1) Analytics experience in the BFSI domain is a must.
2) Hands-on technical experience in Python, big data, and AI.
3) Understanding of data models and analytical concepts.
4) Client engagement:
Should have run past client engagements for big data/AI projects, from requirement gathering through planning development sprints to delivery.
Should have experience in deploying big data and AI projects.
First-hand experience with data governance, data quality, customer data models, and industry data models.
Aware of the SDLC.
Regards,
Deepak Sharma
HR Team



Fullstack Developer
Features:
- Expertise in one or more programming languages, preferably Python and the Django framework (mandatory)
- Very good knowledge of database management systems (SQL, NoSQL)
- Experience in HTML, CSS, and JavaScript using React.js (mandatory)
Preferred technology stack:
- Languages: Python, JavaScript (React.js)
- Frameworks: Django
- Databases: PostgreSQL, MySQL, SQLite


Do you thrive on working with cutting edge technology, with innovators in the early stages of ideas, products, or platforms? Do you have the passion to be a core member of a fast-growing start-up? Are you an expert or aspiring to become one who can work on the Full Stack of an application? Do you have a genuine interest in all software technologies? If the answer is yes, do reach out to us - Crediwatch is the place for you!
You will be exposed to work on every level of the stack in a highly agile, fast growing, start-up FinTech environment, while ensuring Breakthrough Simplicity in innovation and Diving Deep to arrive at a solution-based approach to problem solving and idea creation.
The environment at Crediwatch is vibrant and innovative! You will learn and regularly interact with peers who are the best at what they do and will motivate you to be the best version of yourself, technically and professionally.
The Role Expectation
- You will play a key role in the development of the core product, working directly with the Business team on realizing their needs and translating it into the product.
- You will be involved in the overall design, architecture and development of the application, maintaining quality and ensuring performance and compliance to software standards and guidelines
- You will adhere to the best practices of Agile & TDD.
- You will collaborate with the rest of the engineering team to design, prioritise and launch new features.
- You will take ownership for organising code and maintain its integrity at all points in time.
- You will be responsible for understanding and implementing security and data protection best practices.
- You will bring in a passion for technology and hunger to learn
You Have
- A sound knowledge of Python, hands-on experience in using Django/Flask etc., design patterns and application design
- Experience with database architectures like NoSQL, RDBMS. Eg: MongoDB/ Cassandra / Couchbase / MySQL
- A good understanding of message queue mechanisms like Redis/RabbitMQ (or similar)
- Knowledge of python web crawling frameworks like Scrapy & Frontera
- Hands on experience with CI/CD
- Experience in using Git for source code management & Jira
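The message-queue pattern listed above can be illustrated at the language level with Python's in-process queue.Queue standing in for an external broker like Redis or RabbitMQ (toy example; real brokers add persistence, acknowledgements, and cross-process delivery):

```python
# In-process producer/consumer sketch of the message-queue pattern.
# queue.Queue is only a stand-in to show the decoupling idea; a real
# deployment would use an external broker (Redis, RabbitMQ, etc.).
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    while True:
        item = tasks.get()
        if item is None:          # sentinel: shut the worker down
            tasks.task_done()
            break
        results.append(item * 2)  # stand-in for real "processing"
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()
for i in range(5):
    tasks.put(i)       # producer side: enqueue work
tasks.put(None)        # signal completion
tasks.join()           # block until every enqueued item is processed
t.join()
print(results)  # [0, 2, 4, 6, 8]
```

The key property the broker provides is the same as here: producers and consumers never call each other directly, so they can scale and fail independently.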
Good to Have
- Strong Linux skills, knowledge of building large-scale, multi-threaded applications, and experience in designing and building RESTful web services.
- Building API services with GraphQL
- Skills in configuring and implementing Hadoop/HDFS, Elasticsearch and HBase or knowledge of any graph databases
- Prior experience in architecting large scale distributed systems and experience with cloud deployments on AWS/Azure
- Prior experience working in a fast-paced start-up environment.
- Domain knowledge in the financial/data space; any external, relevant certifications.
Your Background
- 6+ years of hands-on development experience
- A bachelor’s or master’s degree in a related field
You Believe-in & will align with our Business Objectives
- Customer Obsession - Consistently listen to customers; test, enhance and improve the customer experience.
- Breakthrough Simplicity - An innovative approach to make everything simpler
- Diving Deep - A technique used to arrive at a solution-based approach to problem solving and idea creation
- Drive for Results - Focus on end result of any task
- Encourage and Improve - Encouraging and promoting team work and focus on continuous self-development at every stage.
- Be Ethical and Respectful - Willingness to do the right thing – even if it is hard; courteousness and being focused on the best in others.
Who We Are
We build innovative technology everyday!
Crediwatch is a ‘Data Insights-as-a-Service’ company that provides lenders and businesses with the actionable credit intelligence on private entities that they need to improve trust and increase their lending and trading activity. Crediwatch does this with no human intervention, by deploying the latest practical AI and technology tools that provide the most reliable, comprehensive, real-time inputs.
Each day at Crediwatch is about striving for transparent insights, analysis and accurate results. If this aligns with your interests and aspirations, we have interesting positions for you.
You Will Enjoy
Our start-up environment - fun, casual, informal, family & pet-friendly! Ours is a highly energized playground where brilliant minds come together to make bold, impactful decisions every day! Needless to say, we have excellent filter coffee, health drinks round the clock, lunch buffets, PS4 and Foosball breaks and a stocked kitchen.
We play to win and have fun doing it! We work to engage your brain by organizing brilliant TechTalks by industry leaders and frequent high-on-energy hackathons and engage your crazy fun-side at our well-planned retreats. We are a highly eco-conscious team and we encourage and support our team’s physical & mental wellbeing.
All these and a great set of people to work with - We Are Crediwatch!

The candidate will need to demonstrate previous experience of taking a client's requirements and translating them into technical specifications, and should be able to give examples of working with internal customers and technical peers.
The ideal candidate should have 3 to 5 years' experience within a web application development environment.
Experience with PHP, HTML, CSS, MySQL, SOAP/REST, JavaScript, AJAX, etc. is a must. The following are the key areas in which we expect you to have a strong command:
- Analytical Skills
- Algorithms
Responsibilities and Duties
- Development of software extensions to vTiger CRM/Drupal/WordPress and any other PHP based application
- Development of integration between CRM and ERP systems provided by Target Integration and any other system using their APIs
- Development and maintenance of code in Gitlab (In-house git server)
- Managing tickets and updating them regularly in Redmine (Project and Ticket Management Platform)
- Providing solutions to the pre-sales team on new projects
- Supporting the existing development team
Qualifications and Skills
- Coming up with ideas to help grow the team
- Data Structure
- Data Modelling Concepts
- Object-Oriented Design Principles and Concepts
- MVC pattern in depth.
- Awareness of various other commonly used design patterns, such as Factory, Abstract Factory, etc.
- PHP, MYSQL, HTML, CSS, JS, JSON, Webservices
- Object-Oriented PHP
- SQL in general: procedures, triggers, functions, etc.
- Query designing.
- Thorough understanding of HTTP.
- Version control (specifically Git) and bug tracking software knowledge are important.
Additional Requirements
- Linux system administration experience will be useful
- Experience of working on at least two of Joomla/Drupal/vTiger CRM or Sugar CRM is a must
- Strong understanding of relational database design
- Experience in creating professional-looking web interfaces using CSS and HTML
- Understanding of business processes
- Ideally, they should have worked on CRM/ERP/Accounting applications
- Excellent written and spoken communication skills
Job Type: Full-time

