CLOUDSUFI
Founded: 2019
Type: Products & Services
Size: 100-1000
Stage: Profitable

About

We exist to eliminate the gap between “Human Intuition” and “Data-Backed Decisions”


Data is the new oxygen, and we believe no organization can live without it. We partner with our customers to get to the core of their problems, enable the data supply chain and help them monetize their data. We make enterprise data dance!


Our work elevates the quality of lives for our family, customers, partners and the community.


The human values that we display in all our interactions are:


Passion – we are committed in heart and head

Integrity – we are real, honest, and fair

Empathy – we understand business isn’t just B2B or B2C; it is H2H, i.e., Human to Human

Boldness – we have the courage to think and do differently


The CLOUDSUFI Foundation embraces the power of legacy and the wisdom of those who helped lay the foundation for all of us: our seniors. We believe in their abilities, and we pledge to equip them, to provide them jobs, and to bring them sufi joy.


Tech stack

Go Programming (Golang)
Java
Machine Learning (ML)
Data Analytics
React.js
React Native
Information security
Cyber Security
Data engineering
SAP

Connect with the team

Harmeet Singh
Ayushi Dwivedi
Lishta Jain

Company social profiles

Instagram · LinkedIn · Twitter · Facebook

Jobs at CLOUDSUFI

Posted by Ayushi Dwivedi
Remote only
3 - 5 yrs
₹15L - ₹25L / yr
Google Cloud Platform (GCP)
Python
SQL

If interested, please share your resume with ayushi.dwivedi at cloudsufi.com.


Note: This role is remote but requires a quarterly visit to the Noida office (one week per quarter). If that works for you, please share your resume.


Data Engineer 

Position Type: Full-time


About Us

CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.


Job Summary

We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.


Key Responsibilities

ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python.

Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.

Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.

Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards. 

API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.

Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.

Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.

Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
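To give a flavour of the entity-resolution and ID-generation responsibility above, here is a minimal, hypothetical sketch. The function names and the `ent/` ID prefix are ours for illustration, not the project's actual API: a name is normalized, then either matched to a known ID or given a new deterministic one.

```python
import hashlib
import re

def normalize_name(name: str) -> str:
    """Lowercase, strip punctuation and collapse whitespace so that
    'ACME Corp.' and 'acme  corp' resolve to the same key."""
    cleaned = re.sub(r"[^\w\s]", "", name.lower())
    return re.sub(r"\s+", " ", cleaned).strip()

def resolve_entity_id(name: str, existing: dict) -> str:
    """Return the existing ID for a known entity, or mint a new
    deterministic ID from the normalized name and record it."""
    key = normalize_name(name)
    if key in existing:
        return existing[key]
    new_id = "ent/" + hashlib.sha256(key.encode()).hexdigest()[:12]
    existing[key] = new_id
    return new_id

index = {}
a = resolve_entity_id("ACME Corp.", index)
b = resolve_entity_id("acme  corp", index)
assert a == b  # both spellings resolve to one canonical ID
```

Real entity resolution would of course add fuzzy matching and human review for ambiguous cases; the point here is only that ID generation must be deterministic and standardized so re-imports do not mint duplicates.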

Qualifications and Skills

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.

Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.

Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.

Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.


Core Competencies:

Must Have - SQL, Python, BigQuery, GCP Dataflow / Apache Beam, Google Cloud Storage (GCS)

Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)

Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modelling

Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD).

Experience with data validation techniques and tools.

Familiarity with CI/CD practices and the ability to work in an Agile framework.

Strong problem-solving skills and keen attention to detail.


Preferred Qualifications:

Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).

Familiarity with similar large-scale public dataset integration initiatives.

Experience with multilingual data integration.

Posted by Ayushi Dwivedi
Remote only
6 - 10 yrs
₹25L - ₹38L / yr
Google Cloud Platform (GCP)
SQL
Python
BigQuery

If interested, please send your resume to ayushi.dwivedi at cloudsufi.com.


The candidate must currently be located in Bangalore (client office visits are required), and must also be open to visiting the Noida office for one week per quarter.


About Us

CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.

 

Our Values 

We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.

 

Equal Opportunity Statement 

CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/


Job Summary

We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.


Key Responsibilities

ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python.

Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.

Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.

Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards. 

API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.

Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.

Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.

Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.

Qualifications and Skills

Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.

Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.

Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.

Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.


Core Competencies:

Must Have - SQL, Python, BigQuery, GCP Dataflow / Apache Beam, Google Cloud Storage (GCS)

Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)

Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modelling

Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD).

Experience with data validation techniques and tools.

Familiarity with CI/CD practices and the ability to work in an Agile framework.

Strong problem-solving skills and keen attention to detail.


Preferred Qualifications:

Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).

Familiarity with similar large-scale public dataset integration initiatives.

Experience with multilingual data integration.

Posted by Lishta Jain
Noida
8 - 14 yrs
₹25L - ₹40L / yr
Java
Google Cloud Platform (GCP)
Spring
RESTful APIs
Microservices

About Us :


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values :


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement :


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace.


About the Role


Job Title: Lead Java Developer


Location: Noida (Hybrid)


Experience: 7-12 years


Education: BTech / BE / ME / MTech / MCA / MSc Computer Science



Primary Skills - Java 8-17+, Core Java, Design patterns (more than Singleton & Factory), Web services development (REST/SOAP), XML & JSON manipulation, OAuth 2.0, CI/CD, SQL / NoSQL


Secondary Skills - Kafka, Jenkins, Kubernetes, Google Cloud Platform (GCP), SAP JCo library, Terraform


Certifications (Optional): OCPJP (Oracle Certified Professional Java Programmer) / Google Professional Cloud Developer



Required Experience:


● Must have integration component development experience using Java 8/9 technologies and service-oriented architecture (SOA)


● Must have in-depth knowledge of design patterns and integration architecture


● Must have experience in system scalability and maintenance for complex enterprise applications and integration solutions


● Experience with developing solutions on Google Cloud Platform will be an added advantage.


● Should have good hands-on experience with Software Engineering tools, viz. Eclipse, NetBeans, JIRA, Confluence, BitBucket, SVN, etc.


● Should be well versed with current technology trends in IT solutions, e.g. Cloud Platform Development, DevOps, Low-Code solutions, Intelligent Automation


Good to Have:


● Experience of developing 3-4 integration adapters/connectors for enterprise applications (ERP, CRM, HCM, SCM, Billing, etc.) using industry-standard frameworks and methodologies following Agile/Scrum


Behavioral competencies required:


● Must have worked with US/Europe based clients in onsite/offshore delivery model


● Should have very good verbal and written communication, technical articulation, listening and presentation skills


● Should have proven analytical and problem solving skills


● Should have demonstrated effective task prioritization, time management and internal/external stakeholder management skills


● Should be a quick learner and team player


● Should have experience of working under stringent deadlines in a Matrix organization structure


● Should have demonstrated appreciable Organizational Citizenship Behavior (OCB) in past organizations


Job Responsibilities:


● Writing the design specifications and user stories for the functionalities assigned.


● Develop assigned components / classes and assist QA team in writing the test cases


● Create and maintain coding best practices and do peer code / solution reviews


● Participate in Daily Scrum calls, Scrum Planning, Retro and Demos meetings


● Bring out technical/design/architectural challenges/risks during execution, develop action plan for mitigation and aversion of identified risks


● Comply with development processes, documentation templates and tools prescribed by CloudSufi and/or its clients


● Work with other teams and Architects in the organization and assist them on technical Issues/Demos/POCs and proposal writing for prospective clients


● Contribute towards the creation of knowledge repository, reusable assets/solution accelerators and IPs


● Provide feedback to junior developers and be a coach and mentor for them


● Provide training sessions on the latest technologies and topics to other employees in the organization


● Participate in organization development activities from time to time - interviews, CSR/employee engagement activities, participation in business events/conferences, and implementation of new policies, systems and procedures as decided by the Management team

Posted by Lishta Jain

The recruiter has not been active on this job recently. You may apply but please expect a delayed response.

Noida
8 - 15 yrs
₹30L - ₹35L / yr
Java
Project Management
Design patterns
OAuth
Technical Architecture

About Us :


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values :


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement :


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace.


Role: Project Manager

Location: Noida, Delhi NCR

Experience: 8-15 years

Education: BTech / BE / MCA / MSc Computer Science



Primary Skills - 


Java 8-17+, Core Java, Design patterns (more than Singleton & Factory), Multi-tenant SaaS platforms, Event-driven systems, Scalable Cloud-native architecture, Web services development (REST/SOAP), XML & JSON manipulation, OAuth 2.0, CI/CD, SQL / NoSQL, Program Strategy and execution, Technical & Architectural Leadership


Secondary Skills - Kafka, Jenkins, Kubernetes, Google Cloud Platform (GCP)

Certifications (Optional): OCPJP (Oracle Certified Professional Java Programmer) / Google Professional Cloud Developer


Required Experience:


  • Must have experience in creating Program Strategy and delivery milestones for complex integration solutions on Cloud
  • Serve as the technical liaison between solution architects and delivery teams, ensuring that low-level designs (LLD) align with the high-level architectural vision (HLD)
  • Must have integration component development experience using Java 8/9 technologies and service-oriented architecture (SOA)
  • Must have in-depth knowledge of design patterns and integration architecture
  • Must have experience in Cloud-native solutions, Domain-driven design, Secure application architecture
  • Must have experience in system scalability and maintenance for complex enterprise applications and integration solutions
  • API security, API gateways, OAuth 2.0
  • Engineering roadmaps, Architecture governance, SDLC and DevOps strategy
  • Experience with developing solutions on Google Cloud Platform will be an added advantage.
  • Should have good hands-on experience with Software Engineering tools viz. Eclipse, NetBeans, JIRA, Confluence, BitBucket, SVN etc.
  • Should be well versed with current technology trends in IT solutions, e.g. Cloud Platform Development, DevOps, Low-Code solutions, Intelligent Automation


Good to Have:


  • Experience of developing 3-4 integration adapters/connectors for enterprise applications (ERP, CRM, HCM, SCM, Billing etc.) using industry standard frameworks and methodologies following Agile/Scrum

Job Responsibilities:

  • Writing the design specifications and user stories for the functionalities assigned.
  • Create and maintain coding best practices and do peer code / solution reviews
  • Run Daily Scrum calls, Scrum Planning, Retro and Demos meetings
  • Bring out technical/design/architectural challenges/risks during execution, develop action plan for mitigation and aversion of identified risks
  • Monitor compliance with development processes, documentation templates and tools prescribed by CloudSufi and/or its clients
  • Work with other teams and Architects in the organization and assist them on technical Issues/Demos/POCs and proposal writing for prospective clients
  • Contribute towards the creation of knowledge repository, reusable assets/solution accelerators and IPs
  • Provide feedback to junior developers and be a coach and mentor for them
  • Provide training sessions on the latest technologies and topics to other employees in the organization
  • Participate in organization development activities from time to time - interviews, CSR/employee engagement activities, participation in business events/conferences, and implementation of new policies, systems and procedures as decided by the Management team

Behavioural competencies required:


  • Must have worked with US/Europe based clients in onsite/offshore delivery model
  • Should have very good verbal and written communication, technical articulation, listening and presentation skills
  • Should have proven analytical and problem solving skills
  • Should have demonstrated effective task prioritization, time management and internal/external stakeholder management skills
  • Should be a quick learner and team player
  • Should have experience of working under stringent deadlines in a Matrix organization structure
  • Should have demonstrated appreciable Organizational Citizenship Behavior (OCB) in past organizations


Posted by Lishta Jain

Noida
3 - 7 yrs
₹15L - ₹28L / yr
Python
FastAPI
Authentication
Google Cloud Platform (GCP)
ACL

About Us


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.


About Role:


The Senior Python Developer will lead the design and implementation of ACL crawler connectors for Workato’s search platform. This role requires deep expertise in building scalable Python services, integrating with various SaaS APIs and designing robust data models. The developer will mentor junior team members and ensure that the solutions meet the technical and performance requirements outlined in the Statement of Work.


Key Responsibilities:


  • Architecture and design: Translate business requirements into technical designs for ACL crawler connectors. Define data models, API interactions and modular components using the Workato SDK.
  • Implementation: Build Python services to authenticate, enumerate domain entities and extract ACL information from OneDrive, ServiceNow, HubSpot and GitHub. Implement incremental sync, pagination, concurrency and caching.
  • Performance optimisation: Profile code, parallelise API calls and utilise asynchronous programming to meet crawl time SLAs. Implement retry logic and error handling for network‑bound operations.
  • Testing and code quality: Develop unit and integration tests, perform code reviews and enforce best practices (type hints, linting). Produce performance reports and documentation.
  • Mentoring and collaboration: Guide junior developers, collaborate with QA, DevOps and product teams, and participate in design reviews and sprint planning.
  • Hypercare support: Provide Level 2/3 support during the initial rollout, troubleshoot issues, implement minor enhancements and deliver knowledge transfer sessions.
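The retry-and-error-handling responsibility above can be sketched with the standard library alone. The names below (`fetch_with_retry`, the flaky stand-in coroutine) are illustrative only, not Workato SDK or connector APIs:

```python
import asyncio
import random

async def fetch_with_retry(fetch, *, attempts=4, base_delay=0.5):
    """Await `fetch()` and retry on network-style failures with
    exponential backoff plus a little jitter."""
    for attempt in range(attempts):
        try:
            return await fetch()
        except OSError:
            if attempt == attempts - 1:
                raise  # out of retries: surface the last error
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            await asyncio.sleep(delay)

# Stand-in for a flaky SaaS API call: fails twice, then succeeds.
calls = {"n": 0}

async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise OSError("transient network error")
    return "ok"

result = asyncio.run(fetch_with_retry(flaky, base_delay=0.05))
```

In a real connector, `fetch` would wrap an authenticated HTTP call, and the except clause would distinguish retryable responses (429, 5xx) from permanent failures rather than retrying everything.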



Must Have Skills and Experiences:


  • Bachelor’s degree in Computer Science or related field.
  • 3-8 years of Python development experience, including asynchronous programming and API integration.
  • Knowledge of Python libraries: pandas, pytest, requests, asyncio
  • Strong understanding of authentication protocols (OAuth 2.0, API keys) and access‑control models.
  • Experience integrating with cloud or SaaS platforms such as Microsoft Graph, ServiceNow REST API, HubSpot API, and GitHub API.
  • Proven ability to lead projects and mentor other engineers.
  • Excellent communication skills and ability to produce clear documentation.



Optional/Good to Have Skills and Experiences:


  • Experience integrating with the Microsoft Graph API, ServiceNow REST API, HubSpot API, and GitHub API.
  • Familiarity with the following libraries, tools and technologies will be advantageous: aiohttp, PyJWT, aiofiles / aiocache
  • Experience with containerisation (Docker), CI/CD pipelines and Workato’s connector SDK is also considered a plus.



Posted by Lishta Jain
Noida
3 - 9 yrs
₹15L - ₹32L / yr
Natural Language Processing (NLP)
Large Language Models (LLM) tuning
Machine Learning (ML)
Retrieval Augmented Generation (RAG)
Python

About Us


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.


Job Summary:


We are seeking a highly innovative and skilled AI Engineer to join our AI CoE for the Data Integration Project. The ideal candidate will be responsible for designing, developing, and deploying intelligent assets and AI agents that automate and optimize various stages of the data ingestion and integration pipeline. This role requires expertise in machine learning, natural language processing (NLP), knowledge representation, and cloud platform services, with a strong focus on building scalable and accurate AI solutions.


Key Responsibilities:

  • LLM-based Auto-schematization: Develop and refine LLM-based models and techniques for automatically inferring schemas from diverse unstructured and semi-structured public datasets and mapping them to a standardized vocabulary.
  • Entity Resolution & ID Generation AI: Design and implement AI models for highly accurate entity resolution, matching new entities with existing IDs and generating unique, standardized IDs for newly identified entities.
  • Automated Data Profiling & Schema Detection: Develop AI/ML accelerators for automated data profiling, pattern detection, and schema detection to understand data structure and quality at scale.
  • Anomaly Detection & Smart Imputation: Create AI-powered solutions for identifying outliers, inconsistencies, and corrupt records, and for intelligently filling missing values using machine learning algorithms.
  • Multilingual Data Integration AI: Develop AI assets for accurately interpreting, translating (leveraging automated tools with human-in-the-loop validation), and semantically mapping data from diverse linguistic sources, preserving meaning and context.
  • Validation Automation & Error Pattern Recognition: Build AI agents to run comprehensive data validation tool checks, identify common error types, suggest fixes, and automate common error corrections.
  • Knowledge Graph RAG/RIG Integration: Integrate Retrieval Augmented Generation (RAG) and Retrieval Augmented Indexing (RIG) techniques to enhance querying capabilities and facilitate consistency checks within the Knowledge Graph.
  • MLOps Implementation: Implement and maintain MLOps practices for the lifecycle management of AI models, including versioning, deployment, monitoring, and retraining on a relevant AI platform.
  • Code Generation & Documentation Automation: Develop AI tools for generating reusable scripts, templates, and comprehensive import documentation to streamline development.
  • Continuous Improvement Systems: Design and build learning systems, feedback loops, and error analytics mechanisms to continuously improve the accuracy and efficiency of AI-powered automation over time.
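As a toy illustration of the anomaly-detection and smart-imputation items above, here is a standard-library sketch. The real pipeline would presumably use learned models rather than a simple z-score rule, and every name here is ours, not a project API:

```python
from statistics import mean, stdev

def flag_outliers(values, threshold=1.5):
    """Indices of non-missing values more than `threshold` standard
    deviations from the mean. A low threshold suits tiny samples,
    where extreme z-scores are mathematically bounded."""
    observed = [v for v in values if v is not None]
    mu, sigma = mean(observed), stdev(observed)
    return [i for i, v in enumerate(values)
            if v is not None and abs(v - mu) > threshold * sigma]

def impute_missing(values):
    """Fill None entries with the mean of the observed values."""
    mu = mean(v for v in values if v is not None)
    return [mu if v is None else v for v in values]

series = [10.2, 9.8, None, 10.5, 250.0, 10.1]
outliers = flag_outliers(series)   # index 4 (the 250.0 reading)
imputed = impute_missing(series)   # None replaced by the mean
```

In practice one would drop or investigate flagged outliers before imputing, since a value like 250.0 drags the mean far from the typical readings.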

Required Skills and Qualifications:


  • Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field.
  • Proven experience (e.g., 3+ years) as an AI/ML Engineer, with a strong portfolio of deployed AI solutions.
  • Strong expertise in Natural Language Processing (NLP), including experience with Large Language Models (LLMs) and their applications in data processing.
  • Proficiency in Python and relevant AI/ML libraries (e.g., TensorFlow, PyTorch, scikit-learn).
  • Hands-on experience with cloud AI/ML services.
  • Understanding of knowledge representation, ontologies (e.g., Schema.org, RDF), and knowledge graphs.
  • Experience with data quality, validation, and anomaly detection techniques.
  • Familiarity with MLOps principles and practices for model deployment and lifecycle management.
  • Strong problem-solving skills and an ability to translate complex data challenges into AI solutions.
  • Excellent communication and collaboration skills.


Preferred Qualifications:


  • Experience with data integration projects, particularly with large-scale public datasets.
  • Familiarity with knowledge graph initiatives.
  • Experience with multilingual data processing and AI.
  • Contributions to open-source AI/ML projects.
  • Experience in an Agile development environment.


Benefits:


  • Opportunity to work on a high-impact project at the forefront of AI and data integration.
  • Contribute to solidifying a leading data initiative's role as a foundational source for grounding Large Models.
  • Access to cutting-edge cloud AI technologies.
  • Collaborative, innovative, and fast-paced work environment.
  • Significant impact on data quality and operational efficiency.


Posted by Lishta Jain

Remote only
4 - 8 yrs
₹25L - ₹32L / yr
API
Netsuite
workato
Python
Amazon Web Services (AWS)

About Us :


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values :


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement :


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace.


Role: Senior Integration Engineer


Location: Remote/Delhi NCR


Experience: 4-8 years


Position Overview :


We are seeking a Senior Integration Engineer with deep expertise in building and managing integrations across Finance, ERP, and business systems. The ideal candidate will have both technical proficiency and strong business understanding, enabling them to translate finance team needs into robust, scalable, and fault-tolerant solutions.


Key Responsibilities:


  • Design, develop, and maintain integrations between financial systems, ERPs, and related applications (e.g., expense management, commissions, accounting, sales)
  • Gather requirements from Finance and Business stakeholders, analyze pain points, and translate them into effective integration solutions
  • Build and support integrations using SOAP and REST APIs, ensuring reliability, scalability, and best practices for logging, error handling, and edge cases
  • Develop, debug, and maintain workflows and automations in platforms such as Workato and Exactly Connect
  • Support and troubleshoot NetSuite SuiteScript, Suiteflows, and related ERP customizations
  • Write, optimize, and execute queries for Zuora (ZQL, Business Objects) and support invoice template customization (HTML)
  • Implement integrations leveraging AWS (RDS, S3) and SFTP for secure and scalable data exchange
  • Perform database operations and scripting using Python and JavaScript for transformation, validation, and automation tasks
  • Provide functional support and debugging for finance tools such as Concur and Coupa
  • Ensure integration architecture follows best practices for fault tolerance, monitoring, and maintainability
  • Collaborate cross-functionally with Finance, Business, and IT teams to align technology solutions with business goals.
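Several of the points above (reliability, logging, error handling) come down to the same pattern: wrapping external API calls in retries with exponential backoff. A minimal Python sketch using only the standard library; `post_invoice` and its payload are hypothetical stand-ins for a real REST call:

```python
import logging
import time
from functools import wraps

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("integration")

def with_retries(max_attempts=3, base_delay=1.0):
    """Retry a flaky call with exponential backoff, logging each failure."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            for attempt in range(1, max_attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    log.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
                    if attempt == max_attempts:
                        raise
                    time.sleep(base_delay * 2 ** (attempt - 1))
        return wrapper
    return decorator

@with_retries(max_attempts=3, base_delay=0.1)
def post_invoice(payload):
    # In a real integration this would be an HTTP call (urllib/requests);
    # it is a stub here so the sketch stays self-contained.
    return {"status": "accepted", "payload": payload}
```

The same decorator can wrap any connector call (NetSuite, Zuora, SFTP transfers), which keeps error handling and logging uniform across integrations.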



Qualifications:


  • 3-8 years of experience in software/system integration with strong exposure to Finance and ERP systems
  • Proven experience integrating ERP systems (e.g., NetSuite, Zuora, Coupa, Concur) with financial tools
  • Strong understanding of finance and business processes: accounting, commissions, expense management, sales operations
  • Hands-on experience with SOAP, REST APIs, Workato, AWS services, SFTP
  • Working knowledge of NetSuite SuiteScript, Suiteflows, and Zuora queries (ZQL, Business Objects, invoice templates)
  • Proficiency with databases, Python, JavaScript, and SQL query optimization
  • Familiarity with Concur and Coupa functionality
  • Strong debugging, problem-solving, and requirement-gathering skills
  • Excellent communication skills and ability to work with cross-functional business teams.



Preferred Skills:


  • Experience with integration design patterns and frameworks
  • Exposure to CI/CD pipelines for integration deployments
  • Knowledge of business and operations practices in financial systems and finance teams
CLOUDSUFI
Posted by Lishta Jain


Noida
6 - 12 yrs
₹22L - ₹34L / yr
Natural Language Processing (NLP)
Large Language Models (LLM) tuning
Generative AI
Python
CI/CD
+4 more

About Us


CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and make better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.


Our Values


We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.


Equal Opportunity Statement


CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.


Role Overview:


As a Senior Data Scientist / AI Engineer, you will be a key player in our technical leadership. You will be responsible for designing, developing, and deploying sophisticated AI and Machine Learning solutions, with a strong emphasis on Generative AI and Large Language Models (LLMs). You will architect and manage scalable AI microservices, drive research into state-of-the-art techniques, and translate complex business requirements into tangible, high-impact products. This role requires a blend of deep technical expertise, strategic thinking, and leadership.


Key Responsibilities:


  • Architect & Develop AI Solutions: Design, build, and deploy robust and scalable machine learning models, with a primary focus on Natural Language Processing (NLP), Generative AI, and LLM-based Agents.
  • Build AI Infrastructure: Create and manage AI-driven microservices using frameworks like Python FastAPI, ensuring high performance and reliability.
  • Lead AI Research & Innovation: Stay abreast of the latest advancements in AI/ML. Lead research initiatives to evaluate and implement state-of-the-art models and techniques for performance and cost optimization.
  • Solve Business Problems: Collaborate with product and business teams to understand challenges and develop data-driven solutions that create significant business value, such as building business rule engines or predictive classification systems.
  • End-to-End Project Ownership: Take ownership of the entire lifecycle of AI projects—from ideation, data processing, and model development to deployment, monitoring, and iteration on cloud platforms.
  • Team Leadership & Mentorship: Lead learning initiatives within the engineering team, mentor junior data scientists and engineers, and establish best practices for AI development.
  • Cross-Functional Collaboration: Work closely with software engineers to integrate AI models into production systems and contribute to the overall system architecture.
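One of the responsibilities above mentions building business rule engines. A deliberately minimal, framework-free sketch of the idea in Python (the rule names, fields, and thresholds are all invented for illustration; a production engine would live behind a microservice such as FastAPI):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]  # condition evaluated on the input record
    action: str                        # label applied when the rule fires

def evaluate(record: dict, rules: list[Rule]) -> list[str]:
    """Return the actions of every rule whose predicate matches the record."""
    return [r.action for r in rules if r.predicate(record)]

# Hypothetical rules for an expense-review workflow.
RULES = [
    Rule("high_amount", lambda r: r.get("amount", 0) > 10_000, "needs_manager_approval"),
    Rule("missing_receipt", lambda r: not r.get("receipt"), "flag_for_audit"),
]
```

Keeping rules as data (rather than hard-coded branches) is what lets business stakeholders extend the system without code changes, which is typically the point of such an engine.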

Required Skills and Qualifications


  • Master’s (M.Tech.) or Bachelor's (B.Tech.) degree in Computer Science, Artificial Intelligence, Information Technology, or a related field.
  • 6+ years of professional experience in a Data Scientist, AI Engineer, or related role.
  • Expert-level proficiency in Python and its core data science libraries (e.g., PyTorch, Huggingface Transformers, Pandas, Scikit-learn).
  • Demonstrable, hands-on experience building and fine-tuning Large Language Models (LLMs) and implementing Generative AI solutions.
  • Proven experience in developing and deploying scalable systems on cloud platforms, particularly AWS. Experience with GCS is a plus.
  • Strong background in Natural Language Processing (NLP), including experience with multilingual models and transcription.
  • Experience with containerization technologies, specifically Docker.
  • Solid understanding of software engineering principles and experience building APIs and microservices.


Preferred Qualifications


  • A strong portfolio of projects. A track record of publications in reputable AI/ML conferences is a plus.
  • Experience with full-stack development (Node.js, Next.js) and various database technologies (SQL, MongoDB, Elasticsearch).
  • Familiarity with setting up and managing CI/CD pipelines (e.g., Jenkins).
  • Proven ability to lead technical teams and mentor other engineers.
  • Experience developing custom tools or packages for data science workflows.


CLOUDSUFI
Posted by Amruta Vaishampayan


Remote only
3 - 7 yrs
₹3L - ₹23L / yr
Java
RESTful APIs
SOAP
XML
JSON
+2 more
Role: Full-Time Individual Contributor (IC)
Reporting to: Solution Architect / Program Manager / COE Head
Location: Noida, Delhi NCR
Shift: Normal day shift with some overlap with US time zones

Experience: 4-7 years
Education: BTech / BE / MCA / MSc Computer Science
Industry: Product Engineering Services or Enterprise Software Companies
Primary Skills: Java 8/9, Core Java, design patterns (beyond Singleton and Factory), web services development (REST/SOAP), XML & JSON manipulation, CI/CD
Secondary Skills: Jenkins, Kubernetes, Google Cloud Platform (GCP), SAP JCo library
Certifications (optional): OCPJP (Oracle Certified Professional Java Programmer) / Google Professional Cloud Developer
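As a small illustration of the XML & JSON manipulation listed among the primary skills (shown in Python for brevity; the `<order>` schema is invented):

```python
import json
import xml.etree.ElementTree as ET

def order_xml_to_json(xml_text: str) -> str:
    """Flatten a simple <order> document into a JSON string."""
    root = ET.fromstring(xml_text)
    # Child elements become fields; the id attribute is carried over.
    record = {child.tag: child.text for child in root}
    record["id"] = root.get("id")
    return json.dumps(record, sort_keys=True)

sample = '<order id="42"><customer>Acme</customer><total>99.50</total></order>'
```

Integration adapters frequently perform exactly this kind of mapping between an ERP's XML payloads and a REST API's JSON.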

Required Experience:
● Must have integration component development experience using Java 8/9 technologies and service-oriented architecture (SOA)
● Must have in-depth knowledge of design patterns and integration architecture
● Experience developing solutions on Google Cloud Platform will be an added advantage
● Should have good hands-on experience with software engineering tools, viz. Eclipse, NetBeans, JIRA, Confluence, Bitbucket, SVN, etc.
● Should be well versed with current technology trends in IT solutions, e.g., cloud platform development, DevOps, low-code solutions, intelligent automation

Good to Have:
● Experience developing 3-4 integration adapters/connectors for enterprise applications (ERP, CRM, HCM, SCM, Billing, etc.) using industry-standard frameworks and methodologies, following Agile/Scrum

Non-Technical/Behavioral competencies required:
● Must have worked with US/Europe-based clients in an onsite/offshore delivery model
● Should have very good verbal and written communication, technical articulation, listening and presentation skills
● Should have proven analytical and problem-solving skills
● Should have demonstrated effective task prioritization, time management and internal/external stakeholder management skills
● Should be a quick learner, self-starter, go-getter and team player
● Should have experience working under stringent deadlines in a matrix organization structure
● Should have demonstrated appreciable Organizational Citizenship Behavior (OCB) in past organizations

Job Responsibilities:
● Write design specifications and user stories for the functionalities assigned
● Develop assigned components/classes and assist the QA team in writing test cases
● Create and maintain coding best practices and do peer code/solution reviews
● Participate in Daily Scrum calls, Scrum Planning, Retro and Demo meetings
● Bring out technical/design/architectural challenges and risks during execution, and develop action plans for mitigating and averting identified risks
● Comply with development processes, documentation templates and tools prescribed by CLOUDSUFI and its clients
● Work with other teams and Architects in the organization and assist them on technical issues/demos/POCs and proposal writing for prospective clients
● Contribute towards the creation of a knowledge repository, reusable assets/solution accelerators and IPs
● Provide feedback to junior developers and be a coach and mentor for them
● Provide training sessions on the latest technologies and topics to other employees in the organization
● Participate in organization development activities from time to time - interviews, CSR/employee engagement activities, participation in business events/conferences, and implementation of new policies, systems and procedures as decided by the Management team