
If interested, please share your resume at ayushi.dwivedi at cloudsufi.com.
Note: This role is remote, but requires a quarterly visit to the Noida office (one week per quarter). If you are comfortable with that, please share your resume.
Data Engineer
Position Type: Full-time
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences & Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.
Job Summary
We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.
Key Responsibilities
ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python.
Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.
Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.
Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards.
API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.
Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.
Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for a significant reduction in manual effort.
Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
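To give a concrete flavor of the ingest-clean-transform and ID-generation work described above, here is a minimal, hypothetical sketch in plain Python. The column names, ID scheme, and `ent/` prefix are illustrative assumptions, not the project's actual conventions; real pipelines would run on Cloud Run or Dataflow/Apache Beam.

```python
import csv
import hashlib
import io

def make_entity_id(name: str, country: str) -> str:
    """Derive a deterministic, standardized ID for an entity.

    Hypothetical scheme: hash the normalized name + country so the
    same entity always maps to the same ID across imports.
    """
    key = f"{name.strip().lower()}|{country.strip().upper()}"
    return "ent/" + hashlib.sha256(key.encode("utf-8")).hexdigest()[:12]

def clean_and_transform(raw_csv: str) -> list[dict]:
    """Ingest a CSV payload, drop incomplete rows, normalize numeric
    strings, and attach a stable entity ID to each record."""
    records = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        if not row.get("name") or not row.get("value"):
            continue  # skip incomplete rows rather than guessing values
        records.append({
            "entity_id": make_entity_id(row["name"], row.get("country", "")),
            "name": row["name"].strip(),
            "value": float(row["value"]),  # normalize numeric strings
        })
    return records

raw = "name,country,value\nAcme Corp,IN,41.5\n,IN,9\nAcme Corp,IN,42.0\n"
records = clean_and_transform(raw)
```

Because the ID is derived deterministically from normalized fields, repeated imports of the same entity resolve to the same ID, which is the core idea behind the entity-resolution responsibility above.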
Qualifications and Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
Experience: 3+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.
Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.
Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.
Core Competencies:
Must Have - SQL, Python, BigQuery, GCP Dataflow / Apache Beam, Google Cloud Storage (GCS)
Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)
Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modelling
Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD).
Experience with data validation techniques and tools.
Familiarity with CI/CD practices and the ability to work in an Agile framework.
Strong problem-solving skills and keen attention to detail.
Preferred Qualifications:
Experience with LLM-based tools or concepts for data automation (e.g., auto-schematization).
Familiarity with similar large-scale public dataset integration initiatives.
Experience with multilingual data integration.
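For candidates newer to the knowledge-graph side of the stack, here is a hypothetical sketch of mapping one cleaned record onto a Schema.org-style JSON-LD node. The type and property names are illustrative only; the project's actual vocabulary and MCF/TMCF conventions would differ.

```python
import json

def to_jsonld(record: dict) -> dict:
    """Map a cleaned record onto a Schema.org-style JSON-LD node.
    The @type and property names here are illustrative only."""
    return {
        "@context": "https://schema.org",
        "@type": "StatisticalVariable",  # hypothetical type name
        "@id": record["entity_id"],
        "name": record["name"],
        "value": record["value"],
    }

node = to_jsonld({"entity_id": "ent/abc123", "name": "Population", "value": 42.0})
print(json.dumps(node, indent=2))
```

The `@id` carries the stable entity ID forward, which is what lets downstream SPARQL queries join new observations onto existing graph nodes.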

About CLOUDSUFI
We exist to eliminate the gap between “Human Intuition” and “Data-Backed Decisions”.
Data is the new oxygen, and we believe no organization can live without it. We partner with our customers to get to the core of their problems, enable the data supply chain and help them monetize their data. We make enterprise data dance!
Our work elevates the quality of life for our families, customers, partners, and the community.
The human values that we display in all our interactions are:
Passion – we are committed in heart and head
Integrity – we are real, honest, and fair
Empathy – we understand business isn’t just B2B or B2C, it is H2H, i.e. Human to Human
Boldness – we have the courage to think and do differently
The CLOUDSUFI Foundation embraces the power of legacy and the wisdom of those who helped lay the foundation for all of us: our seniors. We believe in their abilities, and we pledge to equip them, to provide them jobs, and to bring them sufi joy.