
We Are Hiring for Tally Customer Support
Job Role: Tele-calling & Sales Support
Candidates must have good communication skills
Candidates should be able to speak English fluently
Qualification: B.Com graduates can apply
Job Location: Shimoga, Mangalore, Karnataka
Experience: 1-2 years of experience required

About HIPPO CLOUD TECHNOLOGIES
Highlights - Candidate's current location should be Bangalore
Total Experience - 6-12 years
Joining Time Period - Within 30 days
GCP BigQuery expert, GCP Certified
About Us
CLOUDSUFI, a Google Cloud Premier Partner, is a leading global provider of data-driven digital transformation for cloud-based enterprises. With a global presence and a focus on Software & Platforms, Life Sciences & Healthcare, Retail, CPG, Financial Services, and Supply Chain, CLOUDSUFI is positioned to meet customers where they are in their data monetization journey.
Our Values
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, colour, religion, gender, gender identity or expression, sexual orientation, or national origin. We provide equal opportunities in employment, advancement, and all other areas of our workplace. Please explore more at https://www.cloudsufi.com/
Job Summary
We are seeking a highly skilled and motivated Data Engineer to join our Development POD for the Integration Project. The ideal candidate will be responsible for designing, building, and maintaining robust data pipelines to ingest, clean, transform, and integrate diverse public datasets into our knowledge graph. This role requires a strong understanding of Google Cloud Platform (GCP) services, data engineering best practices, and a commitment to data quality and scalability.
Key Responsibilities
ETL Development: Design, develop, and optimize data ingestion, cleaning, and transformation pipelines for various data sources (e.g., CSV, API, XLS, JSON, SDMX) using Google Cloud Platform services (Cloud Run, Dataflow) and Python.
Schema Mapping & Modeling: Work with LLM-based auto-schematization tools to map source data to our schema.org vocabulary, defining appropriate Statistical Variables (SVs) and generating MCF/TMCF files.
Entity Resolution & ID Generation: Implement processes for accurately matching new entities with existing IDs or generating unique, standardized IDs for new entities.
Knowledge Graph Integration: Integrate transformed data into the Knowledge Graph, ensuring proper versioning and adherence to existing standards.
API Development: Develop and enhance REST and SPARQL APIs via Apigee to enable efficient access to integrated data for internal and external stakeholders.
Data Validation & Quality Assurance: Implement comprehensive data validation and quality checks (statistical, schema, anomaly detection) to ensure data integrity, accuracy, and freshness. Troubleshoot and resolve data import errors.
Automation & Optimization: Collaborate with the Automation POD to leverage and integrate intelligent assets for data identification, profiling, cleaning, schema mapping, and validation, aiming for significant reduction in manual effort.
Collaboration: Work closely with cross-functional teams, including Managed Service POD, Automation POD, and relevant stakeholders.
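The ingest, clean, transform, and ID-generation flow described in the responsibilities above can be sketched in plain Python. This is a minimal illustration only, not the project's pipeline: the CSV fields, the `Country` entity type, and the hash-based ID scheme are all invented assumptions, and the Dataflow/Beam layer is omitted.

```python
import csv
import hashlib
import io

def make_entity_id(name, entity_type):
    """Deterministic ID: the same (name, type) pair always maps to the
    same ID. This hash scheme is illustrative, not the project's real one."""
    key = f"{entity_type}:{name.strip().lower()}"
    return hashlib.sha1(key.encode("utf-8")).hexdigest()[:12]

def clean_row(row):
    """Validation step: drop incomplete rows, coerce the value to float."""
    name = row.get("country", "").strip()
    value = row.get("population", "").strip().replace(",", "")
    if not name or not value:
        return None
    return {"entity": name, "value": float(value)}

def run_pipeline(raw_csv):
    """Ingest CSV text, clean each row, attach a stable entity ID."""
    records = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        cleaned = clean_row(row)
        if cleaned is None:
            continue  # rejected by validation
        cleaned["entity_id"] = make_entity_id(cleaned["entity"], "Country")
        records.append(cleaned)
    return records

raw = "country,population\nIndia,1428627663\n ,123\nFrance,68042591\n"
result = run_pipeline(raw)
print(len(result))  # the blank-name row is dropped
```

In a real Dataflow pipeline each step would become a transform over a PCollection, but the cleaning and deterministic-ID logic would look much the same.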
Qualifications and Skills
Education: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Technology, or a related quantitative field.
Experience: 6+ years of proven experience as a Data Engineer, with a strong portfolio of successfully implemented data pipelines.
Programming Languages: Proficiency in Python for data manipulation, scripting, and pipeline development.
Cloud Platforms and Tools: Expertise in Google Cloud Platform (GCP) services, including Cloud Storage, Cloud SQL, Cloud Run, Dataflow, Pub/Sub, BigQuery, and Apigee. Proficiency with Git-based version control.
Core Competencies:
Must Have - SQL, Python, BigQuery, GCP Dataflow / Apache Beam, Google Cloud Storage (GCS)
Must Have - GCP Certification
Must Have - Proven ability in comprehensive data wrangling, cleaning, and transforming complex datasets from various formats (e.g., API, CSV, XLS, JSON)
Secondary Skills - SPARQL, Schema.org, Apigee, CI/CD (Cloud Build), GCP, Cloud Data Fusion, Data Modelling
Solid understanding of data modeling, schema design, and knowledge graph concepts (e.g., Schema.org, RDF, SPARQL, JSON-LD).
Experience with data validation techniques and tools.
Familiarity with CI/CD practices and the ability to work in an Agile framework.
Strong problem-solving skills and keen attention to detail.
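To make the knowledge-graph vocabulary above concrete, here is a small Python sketch of a Schema.org-style JSON-LD node and a SPARQL query of the same shape. The property names follow Schema.org's `Observation` type, but the values and the query are invented for illustration and may not match the graph's actual predicates.

```python
import json

# A Schema.org-style JSON-LD node for one statistical observation.
# Property names follow schema.org's Observation type; values are invented.
node = {
    "@context": "https://schema.org",
    "@type": "Observation",
    "observationDate": "2023",
    "measuredProperty": "population",
    "value": 68042591,
    "observationAbout": {"@type": "Country", "name": "France"},
}
doc = json.dumps(node, indent=2, sort_keys=True)

# An illustrative SPARQL query retrieving the place names that
# observations are about.
sparql = """
PREFIX schema: <https://schema.org/>
SELECT ?name WHERE {
  ?obs a schema:Observation ;
       schema:observationAbout ?place .
  ?place schema:name ?name .
}
"""
print(doc)
```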
Job Title: Certified MuleSoft Developer
Location: Gurgaon (On-site)
Experience: 4 to 8 Years
Company: Watsoo Express Pvt. Ltd.
Education: BE/BTech/MCA only
Joining: Immediate Joiners Only
Job Summary:
Watsoo Express Pvt. Ltd. is hiring a Certified MuleSoft Developer for an on-site role in Gurgaon. We are looking for professionals with strong technical expertise in MuleSoft integrations, who are ready to join immediately and contribute to high-impact, enterprise-level digital transformation projects.
Key Responsibilities:
- Develop and maintain APIs and integration flows using MuleSoft
- Work with Mule 4, DataWeave, and Anypoint Studio to deliver scalable solutions
- Design API specifications using RAML and follow API-led connectivity approach
- Collaborate with internal teams to translate business requirements into integration solutions
- Ensure performance tuning, monitoring, and troubleshooting of APIs
- Follow best practices for deployment, CI/CD, and API governance
- Support production releases and resolve critical issues promptly
Required Skills & Qualifications:
- MuleSoft Certified Developer (or Integration Specialist)
- 4 to 8 years of MuleSoft integration experience
- Expertise in Mule 4, API Gateway, connectors, and Anypoint Platform
- Proficiency in DataWeave transformations and flow orchestration
- Hands-on experience with CI/CD pipelines and version control
- Strong understanding of integration patterns and enterprise architecture
- Excellent communication and stakeholder management skills
- Immediate availability is a must
Preferred Skills:
- Integration experience with platforms like Salesforce, SAP, or Oracle
- Familiarity with Agile methodologies and tools like JIRA
- Experience using logging/monitoring tools (e.g., Splunk, ELK)
What We Offer:
- Work on innovative projects in the logistics-tech space
- Collaborative, tech-focused team environment
- On-site role with full-time employment at our Gurgaon office
years MS SQL Server Administration experience required
- Provide 24x7 support for critical production systems.
- Provide technical leadership for the service component or the product.
- IT operations background with a strong understanding of database structures, theories, principles, and best practices.
- Experience with Performance Tuning and Optimization (PTO) using native monitoring and third-party tools.
- Experience with backup, restore, and recovery processes.
- Knowledge of High Availability (HA) and Disaster Recovery (DR) options for SQL Server.
- Assist developers with complex query tuning and schema refinement.
- Experience in handling database corruption.
- Manage encrypted databases, TDE, and SSL certificates.
- Manage PaaS and IaaS environments.
- Experience with SSIS packages, SSRS, and SSAS.
- Experience working with Windows Server, including Active Directory.
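DBA tasks like the backup and recovery work above are commonly scripted. The sketch below only builds the T-SQL strings; the database name and backup path are placeholders, and actually executing the statements against SQL Server (via `sqlcmd` or a driver) is deliberately left out.

```python
def backup_statement(db, path):
    """T-SQL for a full backup; COMPRESSION and CHECKSUM are standard options."""
    return (
        f"BACKUP DATABASE [{db}] TO DISK = N'{path}' "
        "WITH COMPRESSION, CHECKSUM, STATS = 10;"
    )

def verify_statement(path):
    """T-SQL that validates the backup file without restoring it."""
    return f"RESTORE VERIFYONLY FROM DISK = N'{path}' WITH CHECKSUM;"

bak = backup_statement("SalesDB", r"D:\backups\SalesDB.bak")
chk = verify_statement(r"D:\backups\SalesDB.bak")
print(bak)
print(chk)
```

Using CHECKSUM on both the backup and the verify pass gives an early signal of page-level corruption, which pairs with the corruption-handling requirement above.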
Role & responsibilities
- Develop new leads among industries that can install solar
- Develop strong relationships with the different stakeholders in these industries
- Generate proposals and offers
- Conduct negotiations with prospective customers
- Close orders with prospective customers
- Work with the Products team to better mold the product to customers' requirements
- Hand-hold customers through their transition to renewable energy
Preferred candidate profile
- Past work experience as a sales executive/manager in an EPC company or with a project developer
- First-hand knowledge of rooftop solar
- Strong industry connections, to be able to work independently
- Bachelor's degree in an associated field
- Comfortable with travelling overnight to visit sites if required
Perks and benefits
- Pay above industry standards
- Very attractive performance-linked bonus
- Attractive ESOPs
- 100% ownership of your tasks
- Flexible, employee-centric work culture
- Massive growth potential
Responsibilities and duties
- Create content marketing campaigns to drive leads and subscribers
- Use SEO best practices to generate traffic to our site
- Regularly produce various content types, including email, social media posts, blogs, and white papers
- Actively manage and promote our blog, and pitch articles to relevant third-party platforms
- Edit content produced by other members of the team
- Analyze content marketing metrics and make changes as needed
- Collaborate with other departments to create innovative content ideas
Job Profile - Oracle Solution Architect
Experience - 6+ years
Location - Bangalore/Chennai/Pune/Noida
Salary - 30 LPA
Qualification - Any
Key Skills - Oracle ERP Cloud solutions, rollout or implementation, Oracle EBS
Roles & Responsibilities-
- Solution architect experience in the delivery of Oracle ERP Cloud solutions: leading the design and implementation of Oracle ERP Cloud solutions across a range of client industries, leading customer engagements, and advising on Oracle Fusion-related topics.
- Strong hands-on experience with Oracle Cloud ERP solution architecture, design, rollout, and implementation leadership.
- Minimum 8 years of experience in Oracle EBS and Oracle ERP Cloud, with the majority of recent experience in Oracle Fusion (Cloud) applications.
- Minimum 4-6 years of experience in Oracle Cloud solution implementation, with recent experience in a lead role in at least 2 full ERP Cloud implementations.
Full Time / Part Time- Full Time
Remote / On-site - Work from office (WFO)
Job Location: Pune/Bangalore/Hyderabad/Indore
- Very good knowledge of MuleSoft components.
- Prior work experience in setting up a COE using MuleSoft Integration Software.
- Good understanding of various integration patterns.
- Ability to deliver projects independently with little or no supervision.
- Previous experience working in a multi-geographic team.
- Previous experience with best programming practices.
- Good written and oral communication skills – English.
- REST & SOAP APIs, Streaming API, Communities, Lightning, Omni-Channel, Knowledge Base, Social Studio, CTI, Console JavaScript
- Omni-Channel, CTI, Console JavaScript, Communities, Lightning (Aura, LWC components)
- Knowledge Base, Social Studio, CTI, Console JavaScript, REST & SOAP integrations, Data Migrations
2. Engage with prospects to understand their unique and specific pain points and produce compelling business cases to meet their needs
3. Develop and implement a scalable sales process from prospecting/demand generation through contract execution
4. Work with internal team members to ensure successful onboarding and implementation for new clients
5. Collaborate with internal product teams and provide feedback from the field to help shape future development
