
About Wibmo
● Experience:
○ 2-4 years of hands-on experience with Microsoft Power Automate (Flow).
○ Experience with Power Apps, Power BI, and Power Platform technologies.
○ Experience in integrating REST APIs, SOAP APIs, and custom connectors.
○ Proficiency in using tools like Microsoft SharePoint, Azure, and Dataverse.
○ Familiarity with Microsoft 365 apps like Teams, Outlook, and Excel.
● Technical Skills:
○ Knowledge of JSON, OData, HTML, JavaScript, and other web-based technologies.
○ Strong understanding of automation, data integration, and process optimization.
○ Experience with D365 (Dynamics 365) and Azure Logic Apps is a plus.
○ Proficient in troubleshooting, problem-solving, and debugging automation workflows.
● Soft Skills:
○ Excellent communication skills to liaise with stakeholders and technical teams.
○ Strong analytical and problem-solving abilities.
○ Self-motivated and capable of working independently as well as part of a team.
Educational Qualifications:
● Bachelor's Degree in Computer Science, Information Technology, Engineering, or a related field (or equivalent practical
experience).
Good to have Qualifications:
● Microsoft Certified: Power Platform certifications (e.g., Power Platform Functional Consultant, Power Automate RPA
Developer) would be advantageous.
● Experience with Agile or Scrum methodologies.

We are looking for a technically driven Full-Stack Engineer for one of our premium clients.
Required Skills
• Hands-on experience with NodeJS, React, Redux, & Docker
• Good to have: an understanding of Kubernetes, Postgres, and AWS (SQS, Lambda, S3)
• Experience implementing microservice architectures
• Experience with Python and Pandas for data manipulation is a plus
• Experience with Power BI and its APIs is a plus
• Experience with building and maintaining large data sets
• Ability to work across structured, semi-structured and unstructured data, extracting information
and identifying linkages across disparate data sets
• Understanding of information security principles
• Ability to understand complex systems and solve challenging problems
• Ability to clearly communicate complex solutions
• Ability to learn new technologies quickly
• Comfortable in a fast-paced, small-team environment
• Open to working within a global team structure; flexible and efficient
• Ability and flexibility to manage multiple assignments in a dynamic, complex and fast-paced
environment
• High level of attention to detail
• Commercial client-facing project experience is a plus
• Business-level language skills and fluency in English
Job description
Position: Data Engineer
Experience: 6+ years
Work Mode: Work from Office
Location: Bangalore
Please note: This position is focused on development rather than migration. Experience in NiFi or TIBCO is mandatory.
Mandatory Skills: ETL, DevOps platform, NiFi or TIBCO
We are seeking an experienced Data Engineer to join our team. As a Data Engineer, you will play a crucial role in developing and maintaining our data infrastructure and ensuring the smooth operation of our data platforms. The ideal candidate should have a strong background in advanced data engineering, scripting languages, cloud and big data technologies, ETL tools, and database structures.
Responsibilities:
• Utilize advanced data engineering techniques, including ETL (Extract, Transform, Load), SQL, and other advanced data manipulation techniques.
• Develop and maintain data-oriented scripting using languages such as Python.
• Create and manage data structures to ensure efficient and accurate data storage and retrieval.
• Work with cloud and big data technologies, specifically the AWS and Azure stacks, to process and analyze large volumes of data.
• Utilize ETL tools such as NiFi and TIBCO to extract, transform, and load data into various systems.
• Have hands-on experience with database structures, particularly MSSQL and Vertica, to optimize data storage and retrieval.
• Manage and maintain the operations of data platforms, ensuring data availability, reliability, and security.
• Collaborate with cross-functional teams to understand data requirements and design appropriate data solutions.
• Stay up-to-date with the latest industry trends and advancements in data engineering and suggest improvements to enhance our data infrastructure.
Requirements:
• A minimum of 6 years of relevant experience as a Data Engineer.
• Proficiency in ETL, SQL, and other advanced data engineering techniques.
• Strong programming skills in scripting languages such as Python.
• Experience in creating and maintaining data structures for efficient data storage and retrieval.
• Familiarity with cloud and big data technologies, specifically the AWS and Azure stacks.
• Hands-on experience with ETL tools, particularly NiFi and TIBCO.
• In-depth knowledge of database structures, including MSSQL and Vertica.
• Proven experience in managing and operating data platforms.
• Strong problem-solving and analytical skills with the ability to handle complex data challenges.
• Excellent communication and collaboration skills to work effectively in a team environment.
• Self-motivated with a strong drive for learning and keeping up-to-date with the latest industry trends.
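The extract-transform-load pattern this role centers on can be sketched in miniature. The example below is purely illustrative, not any specific NiFi or TIBCO flow: the source rows, table name, and validation rule are all hypothetical, and it uses Python's standard library only.

```python
import sqlite3

# Extract: in a real pipeline this would read from a source system
# (an API, a staging table, a file drop); here it is an in-memory stand-in.
def extract():
    return [
        {"order_id": 1, "amount": "120.50", "country": "IN"},
        {"order_id": 2, "amount": "80.00", "country": "US"},
        {"order_id": 3, "amount": "bad", "country": "IN"},  # malformed row
    ]

# Transform: cast types and drop rows that fail validation.
def transform(rows):
    clean = []
    for row in rows:
        try:
            clean.append((row["order_id"], float(row["amount"]), row["country"]))
        except ValueError:
            continue  # skip rows whose amount does not parse
    return clean

# Load: write the cleaned rows into a target table.
def load(rows, conn):
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, country TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # prints 2
```

A production flow adds the concerns the posting lists on top of this skeleton: scheduling, monitoring, retries, and schema management.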


You will:
- Write excellent production code and tests, and help others improve in code reviews
- Analyze high-level requirements to design, document, estimate, and build systems
- Coordinate across teams to identify, resolve, mitigate and prevent technical issues
- Coach and mentor engineers within the team to develop their skills and abilities
- Continuously improve the team's practices in code quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes
You have:
For (Fullstack):
- 2-10 years of experience
- Strong with data structures & algorithms
- Hands-on experience in programming languages: JavaScript (React or Angular), Python, SQL.
- Experience with AWS.
For (Geo Team):
- 4 - 10 years of experience
- Experience with Big Data technologies like Hadoop, Spark, MapReduce, Kafka, etc.
- Experience using object-oriented languages (Java, Python)
- Experience in working with different AWS technologies.
- Experience in software design, architecture and development.
- Excellent competencies in data structures & algorithms.
For (Backend):
- 2-10 years of experience
- Hands-on product development experience using Java/C++/Python
- Experience with AWS, SQL, Git
- Strong with data structures and algorithms
Additional nice to have skills/certifications:
For Java skill set:
Mockito, Grizzly, Netty, VertX, Jersey / JAX-RS, Swagger / Open API, Nginx, Protocol Buffers, Thrift, Aerospike, Redis, Kinesis, Sed, Awk, Perl
For Python skill set: Data Engineering experience, Athena, Lambda, EMR, Spark, Glue, Step Functions, Hadoop, Kinesis, Orc, Parquet, Perl, Awk, Redshift
For (Data Engineering):
- 2-10 years of experience
- Experience with object-oriented/object function scripting languages: Python.
- Experience with AWS cloud services: EC2, RDS, Redshift, S3, Athena, Glue
- Must be proficient in Git, Jenkins, CI/CD (Continuous Integration / Continuous Deployment)
- Experience in big data technologies like Hadoop, MapReduce, Spark, etc.
- Experience with Amazon Web Services and Docker

Outline your part-time freelance developer career with ZettaLogix! Enjoy the flexibility of remote work with the stability of excellent remuneration. We are looking for freelance developers for long-term assignments who can give us 12 to 24 hours per week. (US SHIFT)
Responsibilities:
We are looking for a Senior Software Developer who will be responsible for:
Design, code, test and manage various applications.
Follow outlined standards of quality related to code and systems.
Manage multiple projects simultaneously, and be able to address their specific needs and requirements at a moment's notice
Understand business needs that drive project features & functions and provide internal consultation.
Work closely with project managers to deploy project requirements
Have a true passion for design and technology.
Clear and thoughtful communication is another must-have for successful collaboration
Strong problem-solving and analytical skills
Fluent in English, both verbal and written
The ideal candidate will be responsible for conceptualizing and executing clear, high quality code to develop the best software. You will test your code, identify errors, and iterate to ensure quality code.
Requirements
Qualifications
• Education: Bachelor's Degree in IT, Computer Science, Computer Programming, or a similar field
• 5+ years of professional experience in any one of the below-mentioned software development areas:
JAVA
.NET
DEVOPS
SALESFORCE
RPA ( AUTOMATION ANYWHERE, BLUEPRISM, UIPATH)
REACT
ANGULAR
ORACLE PL/SQL DEVELOPER
DBA ( SQL, ORACLE)
AZURE CLOUD (INFRASTRUCTURE , DEVELOPER)
AWS CLOUD ( INFRASTRUCTURE, DEVELOPER)
MANUAL QA
REPORT TOOLS (COGNOS, TABLEAU, SSRS)
POWER BI
SAS CLINICAL
NETSUITE (DEVELOPER, FUNCTIONAL)
INFORMATICA ETL
TALEND ETL
DATA STAGE ETL
TERADATA ETL
BIGDATA
DATA ENGINEER
PYTHON
C++ DEVELOPER
EMBEDDED SYSTEMS
The salary would be determined based on experience and credentials (25K-80K per month).
Age: Max 35 years
Note: This is a freelance role. The candidate is required to give 2 to 4 hours per day (US shift) and should have a minimum of 5 years' experience in the relevant field.

As a Software Engineer at Quince, you'll be responsible for designing and building scalable infrastructure and applications to solve some very interesting problems in the logistics and finance tech space.
Responsibilities:
- Design and architect solutions on the cloud for various business problems with workflow efficiency and scale in mind.
- Be at the forefront with the business team to learn, understand, identify, and translate functional requirements into technical opportunities.
- End-to-end ownership: from scoping the requirements to the final delivery of the solution, with a keen eye for detail and quality.
- Build and improve logistics components for this innovative M2C supply-chain model.
- Build and maintain scalable ETL data pipelines.
Requirements:
- Bachelor's/Master's/PhD in Computer Science or a closely related subject.
- 1-5 years of experience in building software solutions.
- Good at data structures and their practical applications.
- Proficiency in Kotlin, Java, Python.
- Experience in deploying and maintaining applications on cloud platforms (e.g., AWS, Google Cloud).
- Proficiency with SQL and databases, relational and/or NoSQL (Snowflake, AWS Redshift, etc.).
- Experience with messaging middleware such as Kafka is good to have.
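On the messaging-middleware point above: a Kafka producer serializes each event and keys it so that related messages land on the same partition in order. The sketch below mimics that idea with Python's standard library only, no broker involved; the event fields and partition count are hypothetical, and the MD5-based hash is a stand-in for Kafka's actual default partitioner (which uses murmur2).

```python
import hashlib
import json

NUM_PARTITIONS = 6  # hypothetical partition count for the topic

def serialize_event(order_id, status):
    """Build the key/value byte pair a producer would send to the broker."""
    key = str(order_id).encode("utf-8")
    value = json.dumps({"order_id": order_id, "status": status}).encode("utf-8")
    return key, value

def pick_partition(key, num_partitions):
    """Deterministic key -> partition mapping (stand-in for murmur2)."""
    digest = hashlib.md5(key).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

key, value = serialize_event(42, "shipped")
p1 = pick_partition(key, NUM_PARTITIONS)
p2 = pick_partition(key, NUM_PARTITIONS)
assert p1 == p2  # the same key always routes to the same partition
```

The design point being illustrated: keying by an entity id gives per-entity ordering, because one partition is consumed sequentially.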



Position: Engineering Lead (Platforms) / Product Architect
Location: Bangalore
About Us
The Arth Group comprises companies that operate in the areas of Technology Solutions, Software
Products & Services, Rural Services, and Digital Healthcare. The technology organization serves
as a shared services organization supporting all the group companies.
Job Description Summary
The Engineering Lead (Platforms) / Product Architect will be responsible for leading the
engineering initiatives for the technology platforms being built by the group companies. The
equivalent names for this position are Platform Architect, Product Architect, Software Engineering
Lead, and Product Engineering Lead.
Core Responsibilities
1 Architecture, high level design and detailed design of multiple software platforms.
2 Hands-on development for prototyping and initial releases of software, working closely with the product manager to execute on the product backlog
3 Leading and guiding a technical team of software developers for major releases; code reviews and overall responsibility for Tech Quality Assurance (TQA)
4 Technology risk identification and mitigation, new tech evaluation
Example Deliverables
1 Architecture / Design blueprints for both development and deployment
2 Working software – e.g. mobile applications, server applications, APIs, database applications,
integration services
3 Engineering / build plan for feature development, working platform features as per the product
backlog, fixes to any issues reported, enhancements to any features as per user requirements
4 New technology evaluation report
Qualification & Experience
1 Bachelor's / Master's degree from a reputed university
2 Engineering / Architecture certifications (e.g. TOGAF) would be a plus
3 Years of experience: 15-20 years.
4 Experience of Fintech organization or Financial services expertise is a plus.
Technical Competencies
1 Proven experience in architecting and building software platforms, in particular cloud-based multi-tenant, SaaS architecture
2 Software development expertise:
Back-end technologies e.g. Java, Python, PHP, C# .Net, JavaScript (Node), etc.
Front-end technologies e.g. HTML/CSS and frameworks e.g. Bootstrap, Angular, React, Express, etc.
Mobile applications e.g. native (Android, iOS) and hybrid
Database technologies e.g. MySQL, Oracle, MS SQL Server, MongoDB, etc.
Cloud technologies e.g. Web Services, REST APIs, API gateways, microservices, multi-tenant SaaS applications, usage of Cloud AI services
3 Experience in agile software development
4 Experience in low code platforms and prototyping
5 Expertise in data management & analytics, data visualization, dashboard development e.g.
using Microsoft Power BI, Tableau and similar platforms
Behavioural Competencies
1 Ability to work both as an individual contributor as well as part of a larger team
2 Leadership ability – mentoring and guiding a team of developers, reviewing their software
deliverables, and helping them address any issues
3 Ability to work against stringent deadlines and as per a clearly defined release roadmap and
plan
4 Possessing a positive, can-do attitude – “any problem can be solved”
Reporting and Business Intelligence
Minimum of 8 to 10 years of experience in report building and Business Intelligence, with profound knowledge of Oracle BI Publisher and BI Analytics in the Oracle Fusion HCM domain, especially the Core HR, Payroll, and Absences modules.
Primary skill:
- Experience generating BI publisher reports, dashboards and drill-down reports
- BI Publisher (RTF design/ eText/ Scheduling/ Parameter Handling/Bursting/backup and migration of reports to different pods)
- Design Oracle BI Publisher Data Models, Data sets and Templates/sub-templates
- Strong PL/SQL and Advanced PL/SQL experience
- Solid understanding of performance tuning best practices and experience improving end-to-end processing times
- Knowledge on the data models related to Fusion HCM (Core HR, Absence and Payroll modules)
Nice to have skills:
- Strong OBI Administration experience
- Create mobile applications with Oracle Business Intelligence Mobile App Designer
- Use BI Mobile to access BI Content
- Create and modify Interactive Dashboards
- Knowledge of additional reporting tools viz. HCM Extract / OTBI / Analysis Dashboard would be an added advantage
Behavioural and Process Skills
- Experience working in Agile teams
- Self-motivated and self-directed abilities to prioritize and execute tasks in a high-pressure environment with "time-critical" deadlines
- Proven analytical, evaluative, and problem-solving abilities
- Possesses a team and customer service provision orientation
This role is for 1 month, during which the person will work from the client site in Paris to understand the system architecture and document it. Contract extension for this role will depend purely on the individual's performance.
Since the requirement is immediate and critical, we need someone who can join us soon and travel to Paris in December.
- Hands-on experience handling multiple data sources/datasets
- Experience in a data/BI architect role
- Expert in SSIS, SSRS, SSAS
- Should have knowledge of writing MDX queries
- Technical document preparation
- Should have excellent communication skills
- Process oriented
- Strong project management
- Should be able to think out of the box and provide ideas for better solutions
- Outstanding team player with a positive attitude
We are urgently looking for Azure Data Factory Engineers with 3 to 7 years of overall experience and good experience with Azure Data Factory and SSIS.
Requirements:
- Qualification: Engineering Graduate.
- Years of Experience: 2+ years of relevant experience with Azure Data Factory & SSIS.
- Strong understanding of ETL/ELT concepts.
- Hands on experience on Azure Data Factory.
- Exposure of Cloud technologies, standards, and platforms.
- Sound knowledge of ADF pipeline, configuration, parameters, variables, Integration services runtime (part of Data Factory).
- Strong Experience on SSIS.
- Sound knowledge of implementing slowly changing dimensions and transactional, factless, and snapshot fact tables.
- Azure Data Warehouse.
- Knowledge of Azure components, especially file storage, Azure SQL, Blob storage, and Data Lake Gen2 storage (used as a file system on top of Blob).
- Azure Data Migration tool.
- Well versed with RDBMS systems like SQL Server; DDL, DML scripts.
- Sound knowledge of T-SQL and stored procedures.
- Strong query-writing skills; scheduling tools.
- Ensure quality deliverables and ability to extend work hours if required.
- Azure Certification: Preferred.
- Ability to understand Customer requirements and convert the requirements into technical specifications.
- Ability to understand the overall solution and design components to fit the overall solution.
- Ability to take technical decisions and support the decisions with justifications.
- Ability to work in Agile manner.
- Ability to work independently without much guidance.
- Self-Motivated, team player, results oriented, strong Time Management skills.
- Excellent written and verbal communication skills.
- Willing to travel outside India (short-term as well as long-term)
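The "slowly changing dimensions" requirement above most often means SCD Type 2: when a tracked attribute changes, the current dimension row is closed out and a new current row is inserted, preserving history. In ADF or SSIS this is typically a data flow or a T-SQL MERGE; the minimal Python sketch below only illustrates the logic, and the customer/city column names are hypothetical.

```python
from datetime import date

# Each dimension row carries validity dates and a current-row flag.
dim_customer = [
    {"customer_id": 1, "city": "Pune", "valid_from": date(2020, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, as_of):
    """SCD Type 2: expire the current row on change, then insert a new one."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # attribute unchanged: nothing to do
            row["valid_to"] = as_of     # close out the old version
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "is_current": True})

apply_scd2(dim_customer, 1, "Mumbai", date(2024, 6, 1))
# The dimension now holds the expired Pune row and the current Mumbai row.
```

Fact rows loaded after the change join to the new current row, while historical facts keep pointing at the expired version via the validity dates.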

