
CloudSufi | Urgent Hiring: Data Engineer (Strong GCP Exp. Required)
at CLOUDSUFI
About Us:
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences and Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.
Job Summary:
We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.
Key Responsibilities:
- ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python (a minimal pipeline sketch follows this list).
- Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.
- Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.
- Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards.
- API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.
- Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.
- Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.
- Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
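For context on the ETL responsibility above, the following is a minimal, illustrative Beam/Dataflow pipeline in Python. It is a sketch only: the project ID, bucket, table, and CSV layout are hypothetical placeholders, not project specifics.

```python
# A minimal, illustrative Beam pipeline: read a CSV from GCS, apply a simple
# cleaning/quality step, and load the rows into BigQuery. All resource names
# (project, bucket, dataset.table) and the CSV layout are hypothetical.
import csv

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_row(line: str) -> dict:
    """Parse one CSV line into a dict matching the hypothetical target schema."""
    name, value, obs_date = next(csv.reader([line]))
    return {
        "entity_name": name.strip(),
        "observation_value": float(value),
        "observation_date": obs_date,  # expected as YYYY-MM-DD
    }


def run() -> None:
    options = PipelineOptions(
        flags=[],
        runner="DataflowRunner",             # use "DirectRunner" for local testing
        project="my-gcp-project",            # hypothetical project ID
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadCSV" >> beam.io.ReadFromText(
                "gs://my-bucket/raw/source.csv", skip_header_lines=1
            )
            | "Parse" >> beam.Map(parse_row)
            | "DropBadRows" >> beam.Filter(lambda r: r["observation_value"] >= 0)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-gcp-project:staging.observations",
                schema="entity_name:STRING,observation_value:FLOAT,observation_date:DATE",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

The same structure extends to API, XLS, JSON, or SDMX sources by swapping the read and parse steps while keeping the validation and load stages unchanged.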
Qualifications and Skills:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
- Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.
- Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.
- Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.
Core Competencies:
- Must Have: SQL, Python, BigQuery, GCP Dataflow / Apache Beam, Google Cloud Storage (GCS)
- Must Have: Proven ability in comprehensive data wrangling, cleaning, and transformation of complex datasets from various formats (e.g., API, CSV, XLS, JSON)
- Secondary Skills: SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modeling
- Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD); a brief illustration follows this list.
- Experience with data validation techniques and tools.
- Familiarity with CI/CD practices and the ability to work in an Agile framework.
- Strong problem-solving skills and keen attention to detail.
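For candidates less familiar with the knowledge-graph items above, here is a small illustration of how Schema.org, JSON-LD, and SPARQL fit together using rdflib. The dataset snippet is hypothetical; rdflib 6.0+ (built-in JSON-LD support) and network access to resolve the remote @context are assumed.

```python
# A small, self-contained illustration of Schema.org + JSON-LD + SPARQL with rdflib.
# The dataset snippet is hypothetical; rdflib >= 6.0 and network access to resolve
# the remote @context are assumed.
from rdflib import Graph

doc = """
{
  "@context": "https://schema.org",
  "@type": "Dataset",
  "name": "City population estimates",
  "license": "https://creativecommons.org/licenses/by/4.0/"
}
"""

g = Graph()
g.parse(data=doc, format="json-ld")

query = """
PREFIX schema: <https://schema.org/>
SELECT ?name ?license
WHERE {
  ?ds a schema:Dataset ;
      schema:name ?name ;
      schema:license ?license .
}
"""

for ds_name, ds_license in g.query(query):
    print(ds_name, ds_license)
```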
Preferred Qualifications:
- Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).
- Familiarity with similar large-scale public dataset integration initiatives.
- Experience with multilingual data integration.

About CLOUDSUFI
We exist to eliminate the gap between “Human Intuition” and “Data-Backed Decisions”
Data is the new oxygen, and we believe no organization can live without it. We partner with our customers to get to the core of their problems, enable the data supply chain and help them monetize their data. We make enterprise data dance!
Our work elevates the quality of lives for our family, customers, partners and the community.
The human values we display in all our interactions are:
Passion – we are committed in heart and head
Integrity – we are real, honest, and fair
Empathy – we understand business isn’t just B2B or B2C; it is H2H, i.e., Human to Human
Boldness – we have the courage to think and do differently
The CLOUDSUFI Foundation embraces the power of legacy and the wisdom of those who have helped lay the foundation for all of us: our seniors. We believe in their abilities, and we pledge to equip them, provide them with jobs, and bring them sufi joy.
Similar jobs
Job Title: QA Engineer – Web3/Blockchain
Experience: 3+ Years
Location: Noida Sector 18
Job Type: Full Time
Job Summary:
We are seeking a skilled QA Engineer with hands-on experience in testing Blockchain/Web3 projects such as wallets, DApps, and related decentralized applications. The ideal candidate should be detail-oriented, proactive, and passionate about ensuring the quality and reliability of Web3 solutions.
Mandatory Skills: QA testing of Web3 wallets, DApps, smart contracts, blockchain transactions, and API testing (Postman/Swagger).
Responsibilities:
- Design, develop, and execute test cases for Web3 wallets, DApps, and smart contract integrations.
- Perform functional, regression, API, and end-to-end testing of blockchain applications.
- Validate transactions, gas fees, wallet functionalities, and blockchain data consistency (a brief validation sketch follows this list).
- Collaborate with developers, product managers, and other stakeholders to identify and resolve issues.
- Work with automation tools (if applicable) to enhance QA efficiency.
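As a rough illustration of the transaction checks mentioned above, here is a minimal web3.py sketch (web3.py v6 assumed). The RPC endpoint and transaction hash are hypothetical placeholders, not project details.

```python
# A minimal sketch of a transaction-level check with web3.py (v6 assumed).
# The RPC endpoint and transaction hash are hypothetical placeholders.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://sepolia.example-rpc.io"))  # hypothetical endpoint
tx_hash = "0x" + "ab" * 32                                      # hypothetical tx hash

receipt = w3.eth.get_transaction_receipt(tx_hash)

# Status 1 means the transaction succeeded on-chain.
assert receipt.status == 1, "transaction should have succeeded"

# effectiveGasPrice is present on post-London receipts; gasUsed * effectiveGasPrice
# gives the actual fee paid.
fee_wei = receipt.gasUsed * receipt.effectiveGasPrice
print("gas used:", receipt.gasUsed, "fee (ETH):", w3.from_wei(fee_wei, "ether"))
```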
Requirements:
- 3+ years of QA/testing experience with a focus on Blockchain/Web3 applications.
- Strong knowledge of wallets (MetaMask, Trust Wallet, etc.) and DApps.
- Understanding of smart contracts, Ethereum, and blockchain fundamentals.
- Experience with API testing tools (Postman/Swagger) and bug tracking systems (Jira, etc.).
- Good problem-solving and communication skills.
Key Responsibilities:
- Lead the architecture, design, and implementation of scalable, secure, and highly available AWS infrastructure leveraging services such as VPC, EC2, IAM, S3, SNS/SQS, EKS, KMS, and Secrets Manager.
- Develop and maintain reusable, modular IaC frameworks using Terraform and Terragrunt, and mentor team members on IaC best practices.
- Drive automation of infrastructure provisioning, deployment workflows, and routine operations through advanced Python scripting.
- Take ownership of the cost optimization strategy by analyzing usage patterns, identifying savings opportunities, and implementing guardrails across multiple AWS environments (see the cost-reporting sketch after this list).
- Define and enforce infrastructure governance, including secure access controls, encryption policies, and secret management mechanisms.
- Collaborate cross-functionally with development, QA, and operations teams to streamline and scale CI/CD pipelines for containerized microservices on Kubernetes (EKS).
- Establish monitoring, alerting, and observability practices to ensure platform health, resilience, and performance.
- Serve as a technical mentor and thought leader, guiding junior engineers and shaping cloud adoption and DevOps culture across the organization.
- Evaluate emerging technologies and tools, recommending improvements to enhance system performance, reliability, and developer productivity.
- Ensure infrastructure complies with security, regulatory, and operational standards, and drive initiatives around audit readiness and compliance.
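To illustrate the cost-optimization responsibility above, here is a minimal sketch that reports last month's unblended cost per service via the Cost Explorer API with boto3. It assumes configured AWS credentials and is not a production tool.

```python
# A minimal sketch, not a production tool: report last month's unblended cost per
# AWS service via the Cost Explorer API. Configured AWS credentials are assumed.
from datetime import date, timedelta

import boto3

ce = boto3.client("ce")

end = date.today().replace(day=1)                 # first day of the current month
start = (end - timedelta(days=1)).replace(day=1)  # first day of the previous month

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in resp["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if amount > 0:
        print(f"{service}: ${amount:,.2f}")
```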
Mandatory Skills & Experience:
- AWS (Advanced Expertise): VPC, EC2, IAM, S3, SNS/SQS, EKS, KMS, Secrets Management
- Infrastructure as Code: Extensive experience with Terraform and Terragrunt, including module design and IaC strategy
- Strong command of Kubernetes
- Scripting & Automation: Proficient in Python, with a strong track record of building tools, automating workflows, and integrating cloud services
- Cloud Cost Optimization: Proven ability to analyze cloud spend and implement sustainable cost control strategies
- Leadership: Experience in leading DevOps/infrastructure teams or initiatives, mentoring engineers, and making architecture-level decisions
Nice to Have:
- Experience designing or managing CI/CD pipelines for Kubernetes-based environments
- Backend development background in Python (e.g., FastAPI, Flask)
- Familiarity with monitoring/observability tools such as Prometheus, Grafana, CloudWatch
- Understanding of system performance tuning, capacity planning, and scalability best practices
- Exposure to compliance standards such as SOC 2, HIPAA, or ISO 27001
We are currently seeking a passionate and dynamic HR Recruiter to join our team and help us find the best talent to support our mission.
Job Summary:
The HR Recruiter will be responsible for sourcing, screening, and selecting qualified candidates for various positions within the organization. This role will require a deep understanding of recruitment strategies, excellent communication skills, and the ability to build relationships with hiring managers and candidates alike.
Key Responsibilities:
Collaborate with hiring managers to understand staffing needs and develop effective recruitment strategies.
Create and post job advertisements on various platforms and the company website.
Source candidates using various methods, including social media, job boards, and networking.
Review resumes, conduct initial phone screens, and coordinate interviews with candidates.
Maintain a talent pool of potential candidates for future openings.
Conduct reference checks and background verification as required.
Prepare and extend job offers to selected candidates.
Assist in developing and implementing recruitment processes and policies.
Provide a positive candidate experience throughout the hiring process.
Stay updated on industry trends and recruitment best practices.
Qualifications:
Bachelor’s degree in Human Resources, Business Administration, or a related field.
years of experience in recruitment or talent acquisition.
Proficiency in applicant tracking systems (ATS) and HR software.
Strong knowledge of employment laws and regulations.
Excellent communication and interpersonal skills.
Ability to work collaboratively in a team environment and manage multiple priorities.
Strong organizational and problem-solving abilities.
We’re looking for an iOS Mobile Application Developer with solid knowledge of the iOS application life cycle, especially for modern mobile applications. We need someone to build native applications for iOS using Swift & Objective-C on Xcode. You’ll need to create applications from scratch or configure existing applications.
- RESPONSIBILITIES:
i. Design and implement applications from initial concept, app architecture, and user interface to finished deliverable.
ii. Implement new features, enhancements, and content of existing applications.
iii. Implement design of native application using Auto Layout Guide & Constraints in Interface Builder.
iv. Create and update re-usable code libraries to streamline app development cycle.
v. Contribute to all phases of product development: design, develop, test, maintain, and improve.
- BASIC SKILLS:
i. Good communication and interpersonal skills.
ii. Experience with the Auto Layout Guide.
iii. Strong knowledge of current iOS development languages (Swift & Objective-C).
iv. Ability to manage multiple projects at a time.
v. Flexibility and eagerness to identify, learn, and use new and changing technologies.
vi. Self-confident and enthusiastic.
- Contract length: 12 months
Job Types: Full-time, Contract, Fresher
Pay: ₹16,000–₹22,000 per month
Responsibilities:
1. Cloud Backend Design, along with data storage and backup solutions.
2. Backend APIs and System Design, and integration with IoT devices, web, and mobile applications.
3. Building reusable code and libraries for future use
4. Optimization of the application for maximum speed and scalability
5. Implementation of security and data protection
6. Design and implementation of data storage solutions
Required Skills:
1. Demonstrated history of designing and implementing Cloud-based Microservices Applications using AWS or GCP.
2. 5+ years of hands-on experience using core Java and SpringBoot framework.
3. Good Understanding of Serverless Architecture and Event-Driven Systems.
4. Understanding Product requirements and translating them into technical specifications and development using Agile methodology.
5. Understanding of accessibility and security compliance on AWS/GCP, Spring Boot, and Flask.
6. Good understanding of Data structures and Algorithms.
7. AWS Skills Required: AWS Lambda, DynamoDB, SNS, SQS, S3, IoT Core, Kinesis Streams, Elastic Beanstalk, EC2, IAM, ElastiCache, API Gateway.
Good To have:
1. Knowledge of Kubernetes or other container orchestration tools.
2. Python and Flask (see the serverless handler sketch after this list).
3. Google Cloud Platform and Firebase.
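Since Python appears under Good To Have and the role calls for serverless, event-driven design, here is a minimal, hypothetical sketch of an SQS-triggered AWS Lambda handler in Python. The queue wiring, message fields, and DynamoDB table name are assumptions, not project specifics.

```python
# A minimal, hypothetical sketch of an event-driven serverless handler in Python:
# an SQS-triggered AWS Lambda that writes IoT readings to DynamoDB. The queue
# wiring, message fields, and table name are assumptions, not project specifics.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("device-readings")  # hypothetical DynamoDB table


def handler(event, context):
    """Entry point for messages delivered by an SQS event source mapping."""
    records = event.get("Records", [])
    for record in records:
        payload = json.loads(record["body"])   # SQS message body as JSON
        table.put_item(Item={
            "device_id": payload["deviceId"],  # hypothetical payload fields
            "timestamp": payload["timestamp"],
            "reading": str(payload["reading"]),
        })
    return {"processed": len(records)}
```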
Blume Global (HQ California, www.blumeglobal.com) is a disruptor in the supply chain software market and has built a next-generation, cloud-first Digital Supply Chain Platform for Fortune 500 companies. Blume Global uses its 25+ years of data insights and its global network to help enterprises become more agile, improve service delivery, and reduce cost by removing significant wastage from their operations.
Role Summary:
As an experienced Analyst, you will:
Primarily make our customers think you are magical by resolving complex problems through your technical and product expertise. As you learn more about our product suite, you will extend your depth of knowledge on the products you support and expand into new technology stacks and supply chain domain knowledge. To hone your technical prowess, you will dig deep into databases, data files, logs, and traces to find the source of any problem. Finally, you will be someone our customers trust; they will depend on you to provide timely and accurate information on their application issues.
Responsibilities:
• Prior experience working in an Application/Production support environment.
• MySQL knowledge and SQL querying abilities are needed; Python scripting skills would be advantageous.
• Troubleshoot and develop new solutions that address the root cause of customer problems in tickets escalated from our L1 support team; work independently within the team.
• Problem Management (identifying recurring incidents and notifying L3 for permanent fixes).
• Along with the Customer Success Manager, participate in weekly and monthly reviews with customers.
• Write step-by-step processes, technical solutions, and ticket updates to customers using clear and concise English.
• Study ticket patterns, suggest improvements, and identify areas that can be automated.
• Experience with application support ticketing tools such as ServiceNow and Jira.
• Thorough understanding of SLA management and operational reporting.
• Provide value to the customer through quality, process improvements, and other customer-centric initiatives.
• ITIL V3 Foundation certified, with a thorough grasp of service management processes: Event Management, Incident Management, Problem Management, and Change Management.
Experience: 2 to 5 years
Skills: UI Development, JavaScript, ES6, TypeScript, D3, HTML5, CSS3, AngularJS, ReactJS
Notice Period: Immediate / 15 days
