

CLOUDSUFI
https://www.cloudsufi.com
About
We exist to eliminate the gap between “Human Intuition” and “Data-Backed Decisions”
Data is the new oxygen, and we believe no organization can live without it. We partner with our customers to get to the core of their problems, enable the data supply chain and help them monetize their data. We make enterprise data dance!
Our work elevates the quality of lives for our family, customers, partners and the community.
The human values we display in all our interactions are:
Passion – we are committed in heart and head
Integrity – we are real, honest, and fair
Empathy – we understand business isn’t just B2B or B2C; it is H2H, i.e. Human to Human
Boldness – we have the courage to think and do differently
The CLOUDSUFI Foundation embraces the power of legacy and the wisdom of those who helped lay the foundation for all of us, our seniors. We believe in their abilities, and we pledge to equip them, provide them jobs, and bring them sufi joy.
Jobs at CLOUDSUFI


About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building Products and Solutions for Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and drive better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, and national origin status. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/.
About the Role:
The Senior Python Developer will lead the design and implementation of ACL crawler connectors for Workato’s search platform. This role requires deep expertise in building scalable Python services, integrating with various SaaS APIs and designing robust data models. The developer will mentor junior team members and ensure that the solutions meet the technical and performance requirements outlined in the Statement of Work.
Key Responsibilities:
- Architecture and design: Translate business requirements into technical designs for ACL crawler connectors. Define data models, API interactions and modular components using the Workato SDK.
- Implementation: Build Python services to authenticate, enumerate domain entities and extract ACL information from OneDrive, ServiceNow, HubSpot and GitHub. Implement incremental sync, pagination, concurrency and caching.
- Performance optimization: Profile code, parallelize API calls and use asynchronous programming to meet crawl-time SLAs. Implement retry logic and error handling for network-bound operations.
- Testing and code quality: Develop unit and integration tests, perform code reviews and enforce best practices (type hints, linting). Produce performance reports and documentation.
- Mentoring and collaboration: Guide junior developers, collaborate with QA, DevOps and product teams, and participate in design reviews and sprint planning.
- Hypercare support: Provide Level 2/3 support during the initial rollout, troubleshoot issues, implement minor enhancements and deliver knowledge transfer sessions.
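The retry, pagination and asynchronous-programming requirements above can be sketched as follows. This is a minimal illustration only: `fetch_page`, `crawl_acls` and the page shape are hypothetical stand-ins, not Workato SDK or vendor APIs.

```python
import asyncio
import random

async def with_retries(coro_factory, max_attempts=4, base_delay=0.01):
    """Retry a network-bound coroutine with exponential backoff and jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return await coro_factory()
        except ConnectionError:
            if attempt == max_attempts:
                raise
            # Back off before the next attempt; jitter avoids thundering herds.
            await asyncio.sleep(base_delay * 2 ** attempt * random.random())

async def crawl_acls(fetch_page):
    """Enumerate ACL entries across paginated API responses."""
    entries, cursor = [], None
    while True:
        page = await with_retries(lambda: fetch_page(cursor))
        entries.extend(page["items"])
        cursor = page.get("next_cursor")
        if cursor is None:
            return entries
```

In a real connector the same pattern would wrap authenticated calls to the Microsoft Graph, ServiceNow, HubSpot or GitHub APIs, each with its own pagination semantics.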
Must Have Skills and Experiences:
- Bachelor’s degree in Computer Science or related field.
- 3-8 years of Python development experience, including asynchronous programming and API integration.
- Knowledge of Python libraries such as pandas, pytest, requests and asyncio.
- Strong understanding of authentication protocols (OAuth 2.0, API keys) and access‑control models.
- Experience integrating with cloud or SaaS platforms such as Microsoft Graph, ServiceNow REST API, HubSpot API and GitHub API.
- Proven ability to lead projects and mentor other engineers.
- Excellent communication skills and ability to produce clear documentation.
Optional/Good to Have Skills and Experiences:
- Familiarity with libraries and tools such as aiohttp, PyJWT, and aiofiles/aiocache.
- Experience with containerization (Docker), CI/CD pipelines and Workato’s connector SDK is also considered a plus.



Job Summary:
We are seeking a highly innovative and skilled AI Engineer to join our AI CoE for the Data Integration Project. The ideal candidate will be responsible for designing, developing, and deploying intelligent assets and AI agents that automate and optimize various stages of the data ingestion and integration pipeline. This role requires expertise in machine learning, natural language processing (NLP), knowledge representation, and cloud platform services, with a strong focus on building scalable and accurate AI solutions.
Key Responsibilities:
- LLM-based Auto-schematization: Develop and refine LLM-based models and techniques for automatically inferring schemas from diverse unstructured and semi-structured public datasets and mapping them to a standardized vocabulary.
- Entity Resolution & ID Generation AI: Design and implement AI models for highly accurate entity resolution, matching new entities with existing IDs and generating unique, standardized IDs for newly identified entities.
- Automated Data Profiling & Schema Detection: Develop AI/ML accelerators for automated data profiling, pattern detection, and schema detection to understand data structure and quality at scale.
- Anomaly Detection & Smart Imputation: Create AI-powered solutions for identifying outliers, inconsistencies, and corrupt records, and for intelligently filling missing values using machine learning algorithms.
- Multilingual Data Integration AI: Develop AI assets for accurately interpreting, translating (leveraging automated tools with human-in-the-loop validation), and semantically mapping data from diverse linguistic sources, preserving meaning and context.
- Validation Automation & Error Pattern Recognition: Build AI agents to run comprehensive data validation tool checks, identify common error types, suggest fixes, and automate common error corrections.
- Knowledge Graph RAG/RIG Integration: Integrate Retrieval Augmented Generation (RAG) and Retrieval Augmented Indexing (RIG) techniques to enhance querying capabilities and facilitate consistency checks within the Knowledge Graph.
- MLOps Implementation: Implement and maintain MLOps practices for the lifecycle management of AI models, including versioning, deployment, monitoring, and retraining on a relevant AI platform.
- Code Generation & Documentation Automation: Develop AI tools for generating reusable scripts, templates, and comprehensive import documentation to streamline development.
- Continuous Improvement Systems: Design and build learning systems, feedback loops, and error analytics mechanisms to continuously improve the accuracy and efficiency of AI-powered automation over time.
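The anomaly-detection and smart-imputation responsibilities above can be illustrated with a minimal pure-Python sketch. This is a conceptual stand-in only: a production solution would use learned models rather than z-scores, and the function names are invented for illustration.

```python
import statistics

def flag_outliers(values, threshold=3.0):
    """Return indices of values more than `threshold` std devs from the mean."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # constant column: nothing can be an outlier
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

def impute_missing(values):
    """Fill None entries with the mean of the observed values."""
    observed = [v for v in values if v is not None]
    mean = statistics.fmean(observed)
    return [mean if v is None else v for v in values]
```

The same split between "flag suspicious records" and "fill gaps from the observed distribution" carries over to ML-based versions, where the mean/std-dev rule is replaced by a trained model.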
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related quantitative field.
- Proven experience (e.g., 3+ years) as an AI/ML Engineer, with a strong portfolio of deployed AI solutions.
- Strong expertise in Natural Language Processing (NLP), including experience with Large Language Models (LLMs) and their applications in data processing.
- Proficiency in Python and relevant AI/ML libraries (e.g., TensorFlow, PyTorch, scikit-learn).
- Hands-on experience with cloud AI/ML services.
- Understanding of knowledge representation, ontologies (e.g., Schema.org, RDF), and knowledge graphs.
- Experience with data quality, validation, and anomaly detection techniques.
- Familiarity with MLOps principles and practices for model deployment and lifecycle management.
- Strong problem-solving skills and an ability to translate complex data challenges into AI solutions.
- Excellent communication and collaboration skills.
Preferred Qualifications:
- Experience with data integration projects, particularly with large-scale public datasets.
- Familiarity with knowledge graph initiatives.
- Experience with multilingual data processing and AI.
- Contributions to open-source AI/ML projects.
- Experience in an Agile development environment.
Benefits:
- Opportunity to work on a high-impact project at the forefront of AI and data integration.
- Contribute to solidifying a leading data initiative's role as a foundational source for grounding Large Models.
- Access to cutting-edge cloud AI technologies.
- Collaborative, innovative, and fast-paced work environment.
- Significant impact on data quality and operational efficiency.

We are seeking a talented and passionate Data Engineer to join our growing data team. In this role, you will be responsible for building, maintaining, and optimizing our data pipelines and infrastructure on Google Cloud Platform (GCP). The ideal candidate will have a strong background in data warehousing, ETL/ELT processes, and a passion for turning raw data into actionable insights. You will work closely with data scientists, analysts, and other engineers to support a variety of data-driven initiatives.
Responsibilities:
• Design, develop, and maintain scalable and reliable data pipelines using Dataform or DBT.
• Build and optimize data warehousing solutions on Google BigQuery.
• Develop and manage data workflows using Apache Airflow.
• Write complex and efficient SQL queries for data extraction, transformation, and analysis.
• Develop Python-based scripts and applications for data processing and automation.
• Collaborate with data scientists and analysts to understand their data requirements and provide solutions.
• Implement data quality checks and monitoring to ensure data accuracy and consistency.
• Optimize data pipelines for performance, scalability, and cost-effectiveness.
• Contribute to the design and implementation of data infrastructure best practices.
• Troubleshoot and resolve data-related issues.
• Stay up-to-date with the latest data engineering trends and technologies, particularly within the Google Cloud ecosystem.
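Data quality checks of the kind described above can be sketched as a validate-and-quarantine step. The column names and rules here are hypothetical; in practice such checks would typically run as Dataform/dbt assertions or as Airflow tasks ahead of a BigQuery load.

```python
def check_rows(rows, required=("id", "amount")):
    """Split rows into valid and rejected, recording the rule each row failed."""
    valid, rejected = [], []
    for row in rows:
        missing = [c for c in required if row.get(c) is None]
        if missing:
            rejected.append((row, f"missing: {', '.join(missing)}"))
        elif row["amount"] < 0:
            # Illustrative domain rule: amounts must be non-negative.
            rejected.append((row, "negative amount"))
        else:
            valid.append(row)
    return valid, rejected
```

Keeping the rejected rows (with their failure reason) rather than silently dropping them is what makes the monitoring side of the responsibility possible.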
Qualifications:
• Bachelor's degree in Computer Science, a related technical field, or equivalent practical experience.
• 3-4 years of experience in a Data Engineer role.
• Strong expertise in SQL (preferably with BigQuery SQL).
• Proficiency in Python programming for data manipulation and automation.
• Hands-on experience with Google Cloud Platform (GCP) and its data services.
• Solid understanding of data warehousing concepts and ETL/ELT methodologies.
• Experience with Dataform or DBT for data transformation and modeling.
• Experience with workflow management tools such as Apache Airflow.
• Excellent problem-solving and analytical skills.
• Strong communication and collaboration skills.
• Ability to work independently and as part of a team.
Preferred Qualifications:
• Google Cloud Professional Data Engineer certification.
• Knowledge of data modeling techniques (e.g., dimensional modeling, star schema).
• Familiarity with Agile development methodologies.


Role: Senior Integration Engineer
Location: Remote/Delhi NCR
Experience: 4-8 years
Position Overview :
We are seeking a Senior Integration Engineer with deep expertise in building and managing integrations across Finance, ERP, and business systems. The ideal candidate will have both technical proficiency and strong business understanding, enabling them to translate finance team needs into robust, scalable, and fault-tolerant solutions.
Key Responsibilities:
- Design, develop, and maintain integrations between financial systems, ERPs, and related applications (e.g., expense management, commissions, accounting, sales)
- Gather requirements from Finance and Business stakeholders, analyze pain points, and translate them into effective integration solutions
- Build and support integrations using SOAP and REST APIs, ensuring reliability, scalability, and best practices for logging, error handling, and edge cases
- Develop, debug, and maintain workflows and automations in platforms such as Workato and Exactly Connect
- Support and troubleshoot NetSuite SuiteScript, Suiteflows, and related ERP customizations
- Write, optimize, and execute queries for Zuora (ZQL, Business Objects) and support invoice template customization (HTML)
- Implement integrations leveraging AWS (RDS, S3) and SFTP for secure and scalable data exchange
- Perform database operations and scripting using Python and JavaScript for transformation, validation, and automation tasks
- Provide functional support and debugging for finance tools such as Concur and Coupa
- Ensure integration architecture follows best practices for fault tolerance, monitoring, and maintainability
- Collaborate cross-functionally with Finance, Business, and IT teams to align technology solutions with business goals.
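The logging, error-handling and fault-tolerance expectations above can be illustrated with a dead-letter pattern. `sync_batch` and its arguments are hypothetical sketches, not Workato, NetSuite or Zuora APIs: records that fail transformation are logged and quarantined instead of aborting the whole batch.

```python
import logging

logger = logging.getLogger("integration")

def sync_batch(records, transform, load):
    """Transform and load each record; quarantine failures with their error."""
    loaded, dead_letter = 0, []
    for record in records:
        try:
            load(transform(record))
            loaded += 1
        except Exception as exc:
            # Log and continue: one bad record must not fail the batch.
            logger.warning("record failed: %s (%s)", record, exc)
            dead_letter.append((record, str(exc)))
    return loaded, dead_letter
```

The dead-letter list gives Finance stakeholders an auditable view of what did not sync and why, which is usually a hard requirement in these integrations.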
Qualifications:
- 3-8 years of experience in software/system integration with strong exposure to Finance and ERP systems
- Proven experience integrating ERP systems (e.g., NetSuite, Zuora, Coupa, Concur) with financial tools
- Strong understanding of finance and business processes: accounting, commissions, expense management, sales operations
- Hands-on experience with SOAP, REST APIs, Workato, AWS services, SFTP
- Working knowledge of NetSuite SuiteScript, Suiteflows, and Zuora queries (ZQL, Business Objects, invoice templates)
- Proficiency with databases, Python, JavaScript, and SQL query optimization
- Familiarity with Concur and Coupa functionality
- Strong debugging, problem-solving, and requirement-gathering skills
- Excellent communication skills and ability to work with cross-functional business teams.
Preferred Skills:
- Experience with integration design patterns and frameworks
- Exposure to CI/CD pipelines for integration deployments
- Knowledge of business and operations practices in financial systems and finance teams


Role Overview:
As a Senior Data Scientist / AI Engineer, you will be a key player in our technical leadership. You will be responsible for designing, developing, and deploying sophisticated AI and Machine Learning solutions, with a strong emphasis on Generative AI and Large Language Models (LLMs). You will architect and manage scalable AI microservices, drive research into state-of-the-art techniques, and translate complex business requirements into tangible, high-impact products. This role requires a blend of deep technical expertise, strategic thinking, and leadership.
Key Responsibilities:
- Architect & Develop AI Solutions: Design, build, and deploy robust and scalable machine learning models, with a primary focus on Natural Language Processing (NLP), Generative AI, and LLM-based Agents.
- Build AI Infrastructure: Create and manage AI-driven microservices using frameworks like Python FastAPI, ensuring high performance and reliability.
- Lead AI Research & Innovation: Stay abreast of the latest advancements in AI/ML. Lead research initiatives to evaluate and implement state-of-the-art models and techniques for performance and cost optimization.
- Solve Business Problems: Collaborate with product and business teams to understand challenges and develop data-driven solutions that create significant business value, such as building business rule engines or predictive classification systems.
- End-to-End Project Ownership: Take ownership of the entire lifecycle of AI projects—from ideation, data processing, and model development to deployment, monitoring, and iteration on cloud platforms.
- Team Leadership & Mentorship: Lead learning initiatives within the engineering team, mentor junior data scientists and engineers, and establish best practices for AI development.
- Cross-Functional Collaboration: Work closely with software engineers to integrate AI models into production systems and contribute to the overall system architecture.
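The business rule engines mentioned above can be sketched minimally as ordered predicate/label pairs evaluated against a fact. The rules shown are invented for illustration; a production engine would load them from configuration and combine them with model-based classification.

```python
def evaluate(rules, fact):
    """Return the labels of all rules whose predicate matches the fact."""
    return [label for predicate, label in rules if predicate(fact)]

# Illustrative rules: (predicate, action-label) pairs evaluated in order.
rules = [
    (lambda f: f["amount"] > 10_000, "manual_review"),
    (lambda f: f["country"] not in {"US", "EU"}, "extra_kyc"),
    (lambda f: f["amount"] <= 10_000, "auto_approve"),
]
```

Because rules are plain data, they can be versioned, tested and reviewed independently of the engine itself.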
Required Skills and Qualifications
- Master’s (M.Tech.) or Bachelor's (B.Tech.) degree in Computer Science, Artificial Intelligence, Information Technology, or a related field.
- 6+ years of professional experience in a Data Scientist, AI Engineer, or related role.
- Expert-level proficiency in Python and its core data science libraries (e.g., PyTorch, Huggingface Transformers, Pandas, Scikit-learn).
- Demonstrable, hands-on experience building and fine-tuning Large Language Models (LLMs) and implementing Generative AI solutions.
- Proven experience in developing and deploying scalable systems on cloud platforms, particularly AWS. Experience with GCS is a plus.
- Strong background in Natural Language Processing (NLP), including experience with multilingual models and transcription.
- Experience with containerization technologies, specifically Docker.
- Solid understanding of software engineering principles and experience building APIs and microservices.
Preferred Qualifications
- A strong portfolio of projects. A track record of publications in reputable AI/ML conferences is a plus.
- Experience with full-stack development (Node.js, Next.js) and various database technologies (SQL, MongoDB, Elasticsearch).
- Familiarity with setting up and managing CI/CD pipelines (e.g., Jenkins).
- Proven ability to lead technical teams and mentor other engineers.
- Experience developing custom tools or packages for data science workflows.
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI meets customers wherever they are in their data monetization journey.
What are we looking for
We are seeking a highly skilled and experienced Senior DevOps Engineer to join our team. The ideal candidate will have extensive expertise in modern DevOps tools and practices, particularly in managing CI/CD pipelines, infrastructure as code, and cloud-native environments. This role involves designing, implementing, and maintaining robust, scalable, and efficient infrastructure and deployment pipelines to support our development and operations teams.
Required Skills and Experience:
- 7+ years of experience in DevOps, infrastructure automation, or related fields.
- Advanced expertise in Terraform for infrastructure as code.
- Solid experience with Helm for managing Kubernetes applications.
- Proficient with GitHub for version control, repository management, and workflows.
- Extensive experience with Kubernetes for container orchestration and management.
- In-depth understanding of Google Cloud Platform (GCP) services and architecture.
- Strong scripting and automation skills (e.g., Python, Bash, or equivalent).
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities in agile development environments.
Preferred Qualifications:
- Experience with other CI/CD tools (e.g., Jenkins, GitLab CI/CD).
- Knowledge of additional cloud platforms (e.g., AWS, Azure).
- Certification in Kubernetes (CKA/CKAD) or Google Cloud (GCP Professional DevOps Engineer).
Behavioral Competencies
• Must have worked with US/Europe based clients in onsite/offshore delivery models.
• Should have very good verbal and written communication, technical articulation, listening and presentation skills.
• Should have proven analytical and problem solving skills.
• Should have a collaborative mindset for cross-functional teamwork.
• Passion for solving complex search problems
• Should have demonstrated effective task prioritization, time management and internal/external stakeholder management skills.
• Should be a quick learner, self-starter, go-getter and team player.
• Should have experience of working under stringent deadlines in a Matrix organization structure.


Reporting to: Solution Architect / Program Manager / COE Head
Location: Noida, Delhi NCR
Shift: Normal Day shift with some overlap with US timezones
Experience: 4-7 years
Education: BTech / BE / MCA / MSc Computer Science
Industry: Product Engineering Services or Enterprise Software Companies
Primary Skills - Java 8/9, Core Java, design patterns (beyond Singleton & Factory), REST/SOAP web services development, XML & JSON manipulation, CI/CD.
Secondary Skills - Jenkins, Kubernetes, Google Cloud Platform (GCP), SAP JCo library.
Certifications (Optional) - OCPJP (Oracle Certified Professional Java Programmer) / Google Professional Cloud Developer.
Required Experience:
● Must have integration component development experience using Java 8/9 technologies and service-oriented architecture (SOA)
● Must have in-depth knowledge of design patterns and integration architecture
● Experience developing solutions on Google Cloud Platform is an added advantage
● Should have good hands-on experience with software engineering tools, e.g. Eclipse, NetBeans, JIRA, Confluence, BitBucket, SVN
● Should be well versed in current technology trends in IT solutions, e.g. cloud platform development, DevOps, low-code solutions, intelligent automation
Good to Have:
● Experience developing 3-4 integration adapters/connectors for enterprise applications (ERP, CRM, HCM, SCM, Billing, etc.) using industry-standard frameworks and methodologies, following Agile/Scrum
Non-Technical/ Behavioral competencies required:
● Must have worked with US/Europe based clients in onsite/offshore delivery models
● Should have very good verbal and written communication, technical articulation, listening and presentation skills
● Should have proven analytical and problem-solving skills
● Should have demonstrated effective task prioritization, time management and internal/external stakeholder management skills
● Should be a quick learner, self-starter, go-getter and team player
● Should have experience working under stringent deadlines in a matrix organization structure
● Should have demonstrated appreciable Organizational Citizenship Behavior (OCB) in past organizations
Job Responsibilities:
● Write design specifications and user stories for the functionalities assigned
● Develop assigned components/classes and assist the QA team in writing test cases
● Create and maintain coding best practices and conduct peer code/solution reviews
● Participate in Daily Scrum calls, Scrum Planning, Retro and Demo meetings
● Raise technical/design/architectural challenges and risks during execution, and develop action plans for mitigation and aversion of identified risks
● Comply with development processes, documentation templates and tools prescribed by CLOUDSUFI and its clients
● Work with other teams and Architects in the organization and assist them with technical issues/demos/POCs and proposal writing for prospective clients
● Contribute to the creation of a knowledge repository, reusable assets/solution accelerators and IPs
● Provide feedback to junior developers and be a coach and mentor for them
● Provide training sessions on the latest technologies and topics to other employees in the organization
● Participate in organization development activities from time to time - interviews, CSR/employee engagement activities, participation in business events/conferences, and implementation of new policies, systems and procedures as decided by the Management team
