3+ Graph Databases Jobs in India


About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services, and we partner with our customers to monetize their data and make enterprise data dance.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, or national origin. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.
Position Overview:
We are seeking an experienced Data Engineer to design, develop, and productionize graph database solutions using Neo4j for economic data analysis and modeling. This role requires expertise in graph database architecture, data pipeline development, and production system deployment.
Key Responsibilities
Graph Database Development
- Design and implement Neo4j graph database schemas for complex economic datasets
- Develop efficient graph data models representing economic relationships, transactions, and market dynamics
- Create and optimize Cypher queries for complex analytical workloads
- Build graph-based data pipelines for real-time and batch processing
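For illustration only (not part of the posting): the Cypher workloads described above often start from small parameterized statements. The sketch below builds a hypothetical `MERGE` for a transaction edge between two economic entities; the function name, labels, and properties are invented.

```python
# Hypothetical sketch: build a parameterized Cypher statement for upserting
# a transaction edge between two economic entities. Labels and property
# names are invented for illustration.

def build_transaction_merge(src_id: str, dst_id: str, amount: float, ts: str):
    """Return a parameterized Cypher statement and its parameter map."""
    query = (
        "MERGE (a:Entity {id: $src_id}) "
        "MERGE (b:Entity {id: $dst_id}) "
        "MERGE (a)-[t:TRANSACTED {ts: $ts}]->(b) "
        "SET t.amount = $amount"
    )
    params = {"src_id": src_id, "dst_id": dst_id, "amount": amount, "ts": ts}
    return query, params

query, params = build_transaction_merge("ACME", "GLOBEX", 1250.0, "2024-01-31T00:00:00Z")
print(query)
```

With the official neo4j Python driver, such a pair would typically be executed as `session.run(query, params)` inside a session; parameterized statements let Neo4j reuse cached query plans across ingest batches.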
Data Engineering & Pipeline Development
- Architect scalable data ingestion frameworks for structured and unstructured economic data
- Develop ETL/ELT processes to transform relational and time-series data into graph formats
- Implement data validation, quality checks, and monitoring systems
- Build APIs and services for graph data access and manipulation
Production Systems & Operations
- Deploy and maintain Neo4j clusters in production environments
- Implement backup, disaster recovery, and high availability solutions
- Monitor database performance, optimize queries, and manage capacity planning
- Establish CI/CD pipelines for graph database deployments
Economic Data Specialization
- Model financial market relationships, economic indicators, and trading networks
- Create graph representations of supply chains, market structures, and economic flows
- Develop graph analytics for fraud detection, risk assessment, and market analysis
- Collaborate with economists and analysts to translate business requirements into graph solutions
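As a hedged sketch of the fraud-analytics point above: one classic graph signal is a circular flow of funds. The snippet below runs a depth-first search for a directed cycle over a plain adjacency dict; all names and the toy data are invented, and in production this pattern would more likely be expressed as a Cypher or Neo4j Graph Data Science query.

```python
# Hypothetical illustration: detect a circular money flow (a common
# fraud/AML signal) in a transaction graph held as an adjacency dict.

def find_cycle(graph):
    """Return one directed cycle as a list of nodes (first == last), or None."""
    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / on current path / done
    color = {n: WHITE for n in graph}
    stack = []                            # current DFS path

    def dfs(node):
        color[node] = GRAY
        stack.append(node)
        for nxt in graph.get(node, ()):
            if color.get(nxt, WHITE) == GRAY:       # back edge closes a cycle
                return stack[stack.index(nxt):] + [nxt]
            if color.get(nxt, WHITE) == WHITE:
                found = dfs(nxt)
                if found:
                    return found
        stack.pop()
        color[node] = BLACK
        return None

    for n in list(graph):
        if color[n] == WHITE:
            found = dfs(n)
            if found:
                return found
    return None

transfers = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
print(find_cycle(transfers))  # a cycle through A, B, C
```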
Required Qualifications
Technical Skills:
- **Neo4j Expertise**: 3+ years hands-on experience with Neo4j database development
- **Graph Modeling**: Strong understanding of graph theory and data modeling principles
- **Cypher Query Language**: Advanced proficiency in writing complex Cypher queries
- **Programming**: Python, Java, or Scala for data processing and application development
- **Data Pipeline Tools**: Experience with Apache Kafka, Apache Spark, or similar frameworks
- **Cloud Platforms**: AWS, GCP, or Azure with containerization (Docker, Kubernetes)
Database & Infrastructure
- Experience with graph database administration and performance tuning
- Knowledge of distributed systems and database clustering
- Understanding of data warehousing concepts and dimensional modeling
- Familiarity with other databases (PostgreSQL, MongoDB, Elasticsearch)
Economic Data Experience
- Experience working with financial datasets, market data, or economic indicators
- Understanding of financial data structures and regulatory requirements
- Knowledge of data governance and compliance in financial services
Preferred Qualifications
- **Neo4j Certification**: Neo4j Certified Professional or Graph Data Science certification
- **Advanced Degree**: Master's in Computer Science, Economics, or related field
- **Industry Experience**: 5+ years in financial services, fintech, or economic research
- **Additional Skills**: Machine learning on graphs, network analysis, time-series analysis
Technical Environment
- Neo4j Enterprise Edition with APOC procedures
- Apache Kafka for streaming data ingestion
- Apache Spark for large-scale data processing
- Docker and Kubernetes for containerized deployments
- Git, Jenkins/GitLab CI for version control and deployment
- Monitoring tools: Prometheus, Grafana, ELK stack
Application Requirements
- Portfolio demonstrating Neo4j graph database projects
- Examples of production graph systems you've built
- Experience with economic or financial data modeling preferred

Role Overview
We are looking for a Tech Lead with a strong background in fintech, especially with experience or a strong interest in fraud prevention and Anti-Money Laundering (AML) technologies.
This role is critical in leading our fintech product development, ensuring the integration of robust security measures, and guiding our team in Hyderabad towards delivering high-quality, secure, and compliant software solutions.
Responsibilities
- Lead the development of fintech solutions, focusing on fraud prevention and AML, using TypeScript, ReactJS, Python, and SQL databases.
- Architect and deploy secure, scalable applications on AWS or Azure, adhering to the best practices in financial security and data protection.
- Design and manage databases with an emphasis on security, integrity, and performance, ensuring compliance with fintech regulatory standards.
- Guide and mentor the development team, promoting a culture of excellence, innovation, and continuous learning in the fintech space.
- Collaborate with stakeholders across the company, including product management, design, and QA, to ensure project alignment with business goals and regulatory requirements.
- Keep abreast of the latest trends and technologies in fintech, fraud prevention, and AML, applying this knowledge to drive the company's objectives.
Requirements
- 5-7 years of experience in software development, with a focus on fintech solutions and a strong understanding of fraud prevention and AML strategies.
- Expertise in TypeScript and ReactJS, with familiarity with Python.
- Proven experience with SQL databases and cloud services (AWS or Azure), with certifications in these areas being a plus.
- Demonstrated ability to design and implement secure, high-performance software architectures in the fintech domain.
- Exceptional leadership and communication skills, with the ability to inspire and lead a team towards achieving excellence.
- A bachelor's degree in Computer Science, Engineering, or a related field, with additional certifications in fintech, security, or compliance being highly regarded.
Why Join Us?
- Opportunity to be at the cutting edge of fintech innovation, particularly in fraud prevention and AML.
- Contribute to a company with ambitious goals to revolutionize software development and make a historic impact.
- Be part of a visionary team dedicated to creating a lasting legacy in the tech industry.
- Work in an environment that values innovation, leadership, and the long-term success of its employees.
Role and Responsibilities
The candidate will be responsible for enabling a single view of data from multiple sources.
- Build data pipelines from the data lake into the graph database
- Design the graph database schema
- Write graph database queries for the front-end team to use for visualization
- Enable machine learning algorithms on graph databases
- Guide and enable junior team members
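A minimal sketch of the data-lake-to-graph pipeline step above, assuming invented column names (`customer`, `merchant`, `amount`): tabular rows are flattened into node and edge records that a bulk loader (e.g. `neo4j-admin import` or Cypher `LOAD CSV`) could then consume.

```python
# Hypothetical sketch: flatten tabular rows from a data lake into node and
# edge records ready for a bulk graph load. Column names are invented.

def rows_to_graph(rows):
    """Return (sorted node records, edge records) from transaction rows."""
    nodes, edges = set(), []
    for r in rows:
        nodes.add(("Customer", r["customer"]))
        nodes.add(("Merchant", r["merchant"]))
        edges.append((r["customer"], "PAID", r["merchant"], {"amount": r["amount"]}))
    return sorted(nodes), edges

rows = [
    {"customer": "c1", "merchant": "m1", "amount": 120.0},
    {"customer": "c1", "merchant": "m2", "amount": 40.0},
]
nodes, edges = rows_to_graph(rows)
print(len(nodes), len(edges))  # 3 nodes, 2 edges
```

Deduplicating nodes into a set before the load is what keeps repeated customers from producing duplicate vertices downstream.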
Qualifications and Education Requirements
B.Tech with 2-7 years of experience
Preferred Skills
Must Have
- Hands-on exposure to graph databases such as Neo4j, JanusGraph, etc.
- Hands-on exposure to programming and scripting languages such as Python and PySpark
- Knowledge of working on cloud platforms such as GCP, AWS, etc.
- Knowledge of graph query languages such as CQL (Cypher), Gremlin, etc.
- Knowledge and experience of machine learning
Good to Have
- Knowledge of working in a Hadoop environment
- Knowledge of graph algorithms
- Ability to work to tight deadlines