
About Us :
CLOUDSUFI, a Google Cloud Premier Partner, is a Data Science and Product Engineering organization building products and solutions for the Technology and Enterprise industries. We firmly believe in the power of data to transform businesses and enable better decisions. We combine unmatched experience in business processes with cutting-edge infrastructure and cloud services. We partner with our customers to monetize their data and make enterprise data dance.
Our Values :
We are a passionate and empathetic team that prioritizes human values. Our purpose is to elevate the quality of lives for our family, customers, partners and the community.
Equal Opportunity Statement :
CLOUDSUFI is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. All qualified candidates receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, or national origin. We provide equal opportunities in employment, advancement, and all other areas of our workplace.
Role : Architect/Lead - Python (hands-on experience with at least one SaaS ingestion platform such as Fivetran, Airbyte, etc. is mandatory)
Location : Noida, Delhi/NCR (Hybrid)
Experience : 8-14 years
Education : BTech / BE / MCA / MSc Computer Science
Must have :
- 6+ years of data engineering; at least 3 years working on connector or integration framework development
- Deep Python expertise including PySpark, pyarrow, and an understanding of Spark's execution model (driver vs executor, serialization constraints, partition fan-out)
- Hands-on experience with at least one SaaS ingestion platform (Fivetran, Airbyte, Google DTS, AWS Glue connectors, or equivalent) at the connector-build level, not just configuration
- Strong understanding of OAuth 2.0 flows (auth code, PKCE, client credentials, JWT), rate limiting strategies (token bucket, leaky bucket, per-endpoint quotas), and incremental sync patterns (cursor, watermark, CDC)
- Experience designing shared connector frameworks (reusable auth managers, rate governors, state stores), not just per-connector scripts
- Ability to author and own TDDs and PRDs that can be handed to a junior engineer with minimal back-and-forth
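To make the rate-limiting (token bucket) and incremental-sync (cursor) patterns named above concrete, here is a minimal framework-free sketch. All class, function, and parameter names are hypothetical illustrations, not the API of Fivetran, Airbyte, or any other platform:

```python
import time


class TokenBucket:
    """Minimal token-bucket rate governor: allows bursts up to `capacity`,
    refilling at `rate` tokens per second."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def try_acquire(self, n: int = 1) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= n:
            self.tokens -= n
            return True
        return False


def incremental_sync(fetch_page, state: dict) -> list:
    """Cursor-based incremental sync: resume from the cursor persisted in
    `state` and store the latest cursor, so re-runs only pull new records.
    `fetch_page(cursor)` is assumed to return (records, next_cursor, has_more)."""
    records, cursor = [], state.get("cursor")
    while True:
        page, cursor, more = fetch_page(cursor)
        records.extend(page)
        if not more:
            break
    state["cursor"] = cursor
    return records
```

In a real connector framework, a shared rate governor like this would sit in front of every API client, and the sync state would live in a durable state store rather than an in-memory dict.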
Nice to have :
- Prior exposure to Databricks Asset Bundles / Declarative Automation Bundles or Lakeflow pipelines
- Experience with the Databricks Python Data Source API (DBR 15.4 LTS+); this is extremely rare, so a practical Spark DataSource V2 (Java/Scala) background is treated as equivalent
- GCP DTS or Cloud Data Fusion connector experience
- Knowledge of the specific source systems, particularly Social Ads APIs (Meta, LinkedIn, X) or enterprise SaaS (Salesforce, Oracle)
We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science or Data Engineering, and system design, plus practical experience in building and deploying agentic Generative AI systems. The ideal candidate is passionate about solving complex problems using LLMs, understands the architecture of modern AI agent frameworks such as LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
Key Responsibilities :
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
Required Skills & Qualifications :
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django).
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with cloud infrastructure and deployment practices (any major cloud).
- Data Engineering or Analytics expertise (preferred), e.g. Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, or data warehouse development.
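The tool-augmented agent orchestration mentioned in the responsibilities above can be sketched without any framework; this is the core dispatch loop that libraries like LangChain/LangGraph formalize. The stub model, tool names, and message format here are all hypothetical, not the API of any of those libraries:

```python
from typing import Callable

# Registry of tools the agent may call; names and signatures are illustrative.
TOOLS: dict[str, Callable[[str], str]] = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),
    "echo": lambda text: text,
}


def run_agent(model: Callable[[str], dict], query: str, max_steps: int = 5) -> str:
    """Minimal agent loop: ask the model what to do next; either call the
    named tool and feed the observation back, or return the final answer."""
    context = query
    for _ in range(max_steps):
        action = model(context)  # e.g. {"tool": "calculator", "input": "2+3"}
        if action.get("final") is not None:
            return action["final"]
        result = TOOLS[action["tool"]](action["input"])
        context = f"{context}\nObservation: {result}"
    raise RuntimeError("agent exceeded max_steps without a final answer")
```

Production frameworks add state graphs, retries, streaming, and structured tool schemas around this loop, but the model-decides/tool-executes/observation-feeds-back cycle is the same.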
Hiring : Salesforce CPQ Developer
Experience : 6+ years
Shift timings : 7:30 PM to 3:30 AM
Location : India (Remote)
Key skills: Salesforce CPQ (SteelBrick), Billing, Sales Cloud, LWC, Apex integrations, DevOps, APL, ETL tools
Design scalable and efficient Salesforce Sales Cloud solutions that meet best practices and business requirements.
Lead the technical design and implementation of Sales Cloud features, including CPQ, Partner Portal, Lead, Opportunity and Quote Management.
Provide technical leadership and mentorship to development teams. Review and approve technical designs, code, and configurations.
Work with business stakeholders to gather requirements, provide guidance, and ensure that solutions meet their needs. Translate business requirements into technical specifications.
Oversee and guide the development of custom Salesforce applications, including custom objects, workflows, triggers, and LWC/ Apex code.
Ensure data quality, integrity, and security within the Salesforce platform. Implement data migration strategies and manage data integrations.
Establish and enforce Salesforce development standards, best practices, and governance processes. Monitor and optimize the performance of Salesforce solutions, including addressing performance issues and ensuring efficient use of resources.
Stay up-to-date with Salesforce updates and new features. Propose and implement innovative solutions to enhance Salesforce capabilities and improve business processes.
Document design and code consistently throughout the design/development process.
Diagnose, resolve, and document system issues to support project team.
Research questions with respect to both maintenance and development activities.
Perform post-migration system review and ongoing support.
Prepare and deliver demonstrations/presentations to client audiences and professional peers.
Consistently adhere to best practices around code/data source control, ticket tracking, etc. throughout an assignment.
Skills/Experience:
Bachelor’s degree in Computer Science, Information Systems, or related field.
6+ years of experience in architecting and designing full stack solutions on the Salesforce Platform.
Must have 3+ years of Experience in architecting, designing and developing Salesforce CPQ (SteelBrick CPQ) and Billing solutions.
Minimum 3+ years of Lightning Framework development experience (Aura & LWC).
CPQ Specialist and Salesforce Platform Developer II certifications are required.
Extensive development experience with Apex Classes, Triggers, Visualforce, Lightning, Batch Apex, Salesforce DX, Apex Enterprise Patterns, Apex Mocks, Force.com API, Visual Flows, Platform Events, SOQL, Salesforce APIs, and other programmatic solutions on the Salesforce platform.
Experience in debugging Apex CPU limit errors and SOQL query exceptions, refactoring code, and working with complex implementations involving features like asynchronous processing.
Clear insight into Salesforce platform best practices, coding and design guidelines, and governor limits.
Experience with development tools and technologies: Visual Studio Code, Git, and DevOps setup to automate deployments/releases.
Knowledge of integration architecture as well as third-party integration and ETL tools (such as Informatica, Workato, Boomi, MuleSoft, etc.) with Salesforce.
Experience in Agile development, iterative development, and proof of concepts (POCs).
Excellent written and verbal communication skills with ability to lead technical projects and manage multiple priorities in a fast-paced environment.
Experience:
○ 2-4 years of hands-on experience with Microsoft Power Automate (Flow).
○ Experience with Power Apps, Power BI, and Power Platform technologies.
○ Experience in integrating REST APIs, SOAP APIs, and custom connectors.
○ Proficiency in using tools like Microsoft SharePoint, Azure, and Dataverse.
○ Familiarity with Microsoft 365 apps like Teams, Outlook, and Excel.
● Technical Skills:
○ Knowledge of JSON, OData, HTML, JavaScript, and other web-based technologies.
○ Strong understanding of automation, data integration, and process optimization.
○ Experience with D365 (Dynamics 365) and Azure Logic Apps is a plus.
○ Proficient in troubleshooting, problem-solving, and debugging automation workflows.
● Soft Skills:
○ Excellent communication skills to liaise with stakeholders and technical teams.
○ Strong analytical and problem-solving abilities.
○ Self-motivated and capable of working independently as well as part of a team.
Educational Qualifications:
● Bachelor's Degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent practical experience).
Good to have Qualifications:
● Microsoft Certified: Power Platform certifications (e.g., Power Platform Functional Consultant, Power Automate RPA Developer) would be advantageous.
● Experience with Agile or Scrum methodologies.
Responsibilities
- Manage and drive a team of Data Analysts and Sr. Data Analysts to provide logistics and supply chain solutions.
- Conduct meetings with Clients to gather the requirements and understand the scope.
- Conduct meetings with internal stakeholders to walk them through the solution and hand over the analysis.
- Define business problems, identify solutions, provide analysis and insights from the client's data.
- Conduct scheduled progress reviews on all projects and interact with the onsite team daily.
- Ensure solutions are delivered error free and submitted on time.
- Implement ETL processes using Pentaho Data Integration (Pentaho ETL).
- Design and implement data models in Hadoop.
- Provide end-user training and technical assistance to maximize utilization of tools.
- Deliver technical guidance to team, including hands-on development as necessary; oversee standards, change controls and documentation library for training and reuse.
Requirements
- Bachelor's degree in Engineering.
- 16+ years of experience in supply chain and logistics or a related industry, including analytics experience.
- 3 years of experience handling teams (8+ people) and interacting with executive leadership teams.
- Strong project and time management skills with ability to multitask and prioritize workload.
- Solid expertise with MS Excel, SQL, visualization tools such as Tableau/Power BI, and ETL tools.
- Proficiency in Hadoop / Hive.
- Experience with Pentaho ETL, the Pentaho Visualization API, and Tableau.
- Hands-on experience working with big data sets (millions of records).
- Strong technical and management experience.
Desired Skills and Experience
- .NET, ASP.NET
About SteelEye
SteelEye is a fast-growing FinTech company based in London, with offices in Bangalore and Paris, that offers a data platform to help financial institutions such as investment banks, hedge funds, brokerage firms, and asset management firms comply with financial regulations in the European Union. Our clients can aggregate, search, surveil, and report on trade, communications, and market data. SteelEye also enables customers to gain powerful insights from their data, helping them to trade with greater efficiency and profitability. The company has a highly experienced management team and a strong board, who have decades of technology and management experience and have worked in senior positions at many leading international financial businesses. We are looking to hire a seasoned SRE to join us as we start our next phase of growth. We have a culture of openness, collaboration, and the passion to get things done, whilst appreciating the importance of a good work-life balance.
Being part of a start-up can be as exciting as it is challenging. You will be part of the SteelEye team not just because of your talent but also because of your entrepreneurial flair, which we thrive on at SteelEye. This means we want you to be curious, contribute, ask questions, and share ideas. We encourage you to get involved in helping shape our business.
What you’ll do
- Deliver plugins for our Python-based ETL pipelines.
- Deliver Python microservices for provisioning and managing cloud infrastructure.
- Implement algorithms to analyse large data sets.
- Draft design documents that translate requirements into code.
- Deal with challenges associated with handling large volumes of data.
- Assume responsibilities from technical design through technical client support.
- Manage expectations with internal stakeholders and context-switch in a fast paced environment.
- Thrive in an environment that uses AWS and Elasticsearch extensively.
- Keep abreast of technology and contribute to the engineering strategy.
- Champion best development practices and provide mentorship.
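The plugin deliverable above can be sketched with a simple registry pattern; SteelEye's actual plugin API is not public, so the decorator, registry, and field names below are hypothetical and only illustrate the general shape of pluggable ETL transforms:

```python
from typing import Callable, Iterable

# Hypothetical registry mapping a step name to a record-level transform.
TRANSFORMS: dict[str, Callable[[dict], dict]] = {}


def transform(name: str):
    """Decorator that registers a record-level transform under `name`."""
    def register(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        TRANSFORMS[name] = fn
        return fn
    return register


@transform("normalise_symbol")
def normalise_symbol(record: dict) -> dict:
    # Example plugin: trim and upper-case the instrument symbol field.
    return {**record, "symbol": record["symbol"].strip().upper()}


def run_pipeline(records: Iterable[dict], steps: list[str]) -> list[dict]:
    """Apply the registered transforms, in order, to every record."""
    out = list(records)
    for step in steps:
        out = [TRANSFORMS[step](r) for r in out]
    return out
```

A real pipeline would add schema validation, error routing, and batch/stream execution around this, but new plugins would still drop in via registration rather than edits to the core.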
What we’re looking for
- Experience in:
  - Python 3
  - Python libraries used for data (such as pandas, NumPy)
  - AWS
  - Elasticsearch
  - Performance tuning
  - Object-oriented design and modelling
  - Delivering complex software, ideally in a FinTech setting
  - CI/CD tools
- Knowledge of design patterns.
- Sharp analytical and problem-solving skills.
- Strong sense of ownership.
- Demonstrable desire to learn and grow.
- Excellent written and oral communication skills.
- Mature collaboration and mentoring abilities.
About SteelEye Culture
- Work from home until you are vaccinated against COVID-19
- Top of the line health insurance
- Order discounted meals every day from a dedicated portal
- Fair and simple salary structure
- 30+ holidays in a year
- Fresh fruits every day
- Centrally located. 5 mins to the nearest metro station (MG Road)
- Measured on output and not input
- 7 years of hands-on experience in database development
- Very strong and hands-on in Oracle Database and PL/SQL development
- Hands on experience in designing solutions and developing for data migration projects using Oracle PL/SQL
- Experience with Oracle Argus Safety, ArisG and other Safety/Clinical systems
- Working experience in development of ETL process, DB Design and Data Structures
- Excellent knowledge of relational databases: tables, views, constraints, indexes (B-tree, bitmap, and function-based), object types, stored procedures, functions, packages, triggers, dynamic SQL, SET TRANSACTION, and PL/SQL cursor variables with REF CURSOR
- Excellent written and verbal communication skills
- Facilitation and business analysis skills
- Estimation and project planning skills for enterprise-level applications
- Excellent communication and technical leadership skills
Job Role – SDE2
Duration – 12 months
Location – HYD
Key Skills:
- 3-5 years of experience
- Understanding of databases and building reports
- The ideal candidate should have experience in Cosmos, Kusto, and Power BI
- Mandatory experience in a database query language (such as SQL)