
About Wibmo
∙Designing, building, and automating ETL processes using AWS services and related tools such as Apache Sqoop, Amazon S3, the AWS CLI, Amazon EMR, Amazon MSK, and Amazon SageMaker.
∙Developing and maintaining data pipelines to move and transform data from diverse sources into data warehouses or data lakes.
∙Ensuring data quality and integrity through validation, cleansing, and monitoring ETL processes.
∙Optimizing ETL workflows for performance, scalability, and cost efficiency within the AWS environment.
∙Troubleshooting and resolving issues related to data processing and ETL workflows.
∙Implementing and maintaining security measures and compliance standards for data pipelines and infrastructure.
∙Documenting ETL processes, data mappings, and system architecture.
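The validation and cleansing step above can be sketched in plain Python. The field names and rules below are invented for illustration only; a real pipeline would apply its own schema and business rules:

```python
def cleanse_row(row):
    """Return a cleaned copy of the row, or None if it fails validation.

    The required fields and checks are hypothetical examples.
    """
    required = ("order_id", "amount", "ts")
    if any(not row.get(k) for k in required):
        return None  # reject rows with missing required fields
    try:
        amount = float(row["amount"])
    except (TypeError, ValueError):
        return None  # reject non-numeric amounts
    if amount < 0:
        return None  # reject negative amounts
    return {**row, "order_id": str(row["order_id"]).strip(), "amount": round(amount, 2)}

def run_batch(rows):
    """Split a batch into clean rows and rejects so rejects can be monitored."""
    clean, rejects = [], []
    for row in rows:
        cleaned = cleanse_row(row)
        if cleaned is None:
            rejects.append(row)
        else:
            clean.append(cleaned)
    return clean, rejects
```

Keeping the rejects separate (rather than silently dropping them) is what makes the monitoring side of the responsibility possible, e.g. alerting when the reject rate spikes.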
Job Description: Oracle BI Publisher Developer
Position Type
• Work Type: Full-time
• Employment Type: Contract
Experience Required
• Minimum 1 year of hands-on experience in:
o Oracle Database: SQL development, performance tuning, data modelling
o Oracle BI Publisher: Report design, template customization, data source integration
Technical Skills
• Mandatory:
o Oracle SQL & PL/SQL
o BI Publisher report development and deployment
o XML and XSLT for template customization
• Preferred:
o Experience with Oracle E-Business Suite or Fusion Applications
o Familiarity with data visualization principles
o Basic understanding of performance metrics and report optimization
Responsibilities
• Design, develop, and maintain BI Publisher reports based on business requirements
• Write and optimize SQL queries for data extraction and transformation
• Collaborate with stakeholders to ensure report accuracy and usability
• Troubleshoot and resolve issues related to data and report performance
• Document technical specifications and maintain version control
Responsibilities
- Manage and drive a team of Data Analysts and Sr. Data Analysts to provide logistics and supply chain solutions.
- Conduct meetings with clients to gather requirements and understand the scope.
- Conduct meetings with internal stakeholders to walk them through the solution and hand over the analysis.
- Define business problems, identify solutions, provide analysis and insights from the client's data.
- Conduct scheduled progress reviews on all projects and interact with the onsite team daily.
- Ensure solutions are delivered error free and submitted on time.
- Implement ETL processes using Pentaho Data Integration (Pentaho ETL).
- Design and implement data models in Hadoop.
- Provide end-user training and technical assistance to maximize utilization of tools.
- Deliver technical guidance to team, including hands-on development as necessary; oversee standards, change controls and documentation library for training and reuse.
Requirements
- Bachelor's degree in Engineering.
- 16+ years of experience in supply chain, logistics, or a related industry, including analytics experience.
- 3 years of experience managing teams (8+ people) and interacting with executive leadership teams.
- Strong project and time management skills with the ability to multitask and prioritize workload.
- Solid expertise with MS Excel, SQL, visualization tools such as Tableau or Power BI, and ETL tools.
- Proficiency in Hadoop / Hive.
- Experience with Pentaho ETL, the Pentaho Visualization API, and Tableau.
- Hands-on experience working with big data sets (millions of records).
- Strong technical and management experience.
Desired Skills and Experience
- .NET, ASP.NET
Job Description
Position: Data Engineer
Experience: 6+ years
Work Mode: Work from Office
Location: Bangalore
Please note: This position is focused on development rather than migration. Experience in Nifi or Tibco is mandatory.
Mandatory Skills: ETL, DevOps platform, Nifi or Tibco
We are seeking an experienced Data Engineer to join our team. As a Data Engineer, you will play a crucial role in developing and maintaining our data infrastructure and ensuring the smooth operation of our data platforms. The ideal candidate should have a strong background in advanced data engineering, scripting languages, cloud and big data technologies, ETL tools, and database structures.
Responsibilities:
• Utilize advanced data engineering techniques, including ETL (Extract, Transform, Load), SQL, and other advanced data manipulation techniques.
• Develop and maintain data-oriented scripting using languages such as Python.
• Create and manage data structures to ensure efficient and accurate data storage and retrieval.
• Work with cloud and big data technologies, specifically the AWS and Azure stacks, to process and analyze large volumes of data.
• Utilize ETL tools such as Nifi and Tibco to extract, transform, and load data into various systems.
• Have hands-on experience with database structures, particularly MSSQL and Vertica, to optimize data storage and retrieval.
• Manage and maintain the operations of data platforms, ensuring data availability, reliability, and security.
• Collaborate with cross-functional teams to understand data requirements and design appropriate data solutions.
• Stay up-to-date with the latest industry trends and advancements in data engineering and suggest improvements to enhance our data infrastructure.
Requirements:
• A minimum of 6 years of relevant experience as a Data Engineer.
• Proficiency in ETL, SQL, and other advanced data engineering techniques.
• Strong programming skills in scripting languages such as Python.
• Experience in creating and maintaining data structures for efficient data storage and retrieval.
• Familiarity with cloud and big data technologies, specifically the AWS and Azure stacks.
• Hands-on experience with ETL tools, particularly Nifi and Tibco.
• In-depth knowledge of database structures, including MSSQL and Vertica.
• Proven experience in managing and operating data platforms.
• Strong problem-solving and analytical skills with the ability to handle complex data challenges.
• Excellent communication and collaboration skills to work effectively in a team environment.
• Self-motivated with a strong drive for learning and keeping up-to-date with the latest industry trends.
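As a rough illustration of the extract-transform-load flow this role centres on, here is a minimal sketch; the table, schema, and field names are invented, and the stdlib sqlite3 module stands in for the MSSQL/Vertica targets a real pipeline would use:

```python
import sqlite3

def etl(src_rows, conn):
    """Minimal ETL sketch: src_rows stands in for data pulled from a source system."""
    # Transform: normalise names and filter out inactive records.
    transformed = [
        (r["id"], r["name"].strip().title())
        for r in src_rows
        if r.get("active")
    ]
    # Load: idempotent upsert into the target table, so reruns are safe.
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany(
        "INSERT INTO customers (id, name) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET name = excluded.name",
        transformed,
    )
    conn.commit()
    return len(transformed)
```

The upsert makes the load step idempotent, which is what lets a pipeline be retried after a partial failure without duplicating rows.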
You will:
- Write excellent production code and tests, and help others improve through code reviews
- Analyze high-level requirements to design, document, estimate, and build systems
- Coordinate across teams to identify, resolve, mitigate and prevent technical issues
- Coach and mentor engineers within the team to develop their skills and abilities
- Continuously improve the team's practices in code-quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes
You have:
For (Fullstack):
- 2 - 10 Years of experience
- Strong with DS & Algorithms
- Hands-on experience with programming languages: JavaScript (React or Angular), Python, SQL.
- Experience with AWS.
For (Geo Team):
- 4 - 10 years of experience
- Experience with Big Data technologies like Hadoop, Spark, MapReduce, Kafka, etc.
- Experience using object-oriented languages (Java, Python)
- Experience in working with different AWS technologies.
- Experience in software design, architecture and development.
- Excellent competencies in data structures & algorithms.
For (Backend):
- 2 - 10 years of experience
- Hands-on product development experience using Java / C++ / Python
- Experience with AWS, SQL, Git
- Strong with Data structures and Algorithms
Additional nice to have skills/certifications:
For Java skill set:
Mockito, Grizzly, Netty, VertX, Jersey / JAX-RS, Swagger / Open API, Nginx, Protocol Buffers, Thrift, Aerospike, Redis, Kinesis, Sed, Awk, Perl
For Python skill set: Data Engineering experience, Athena, Lambda, EMR, Spark, Glue, Step Functions, Hadoop, Kinesis, Orc, Parquet, Perl, Awk, Redshift
For (Data Engineering):
- 2 - 10 years of experience
- Experience with object-oriented scripting languages such as Python.
- Experience with AWS cloud services: EC2, RDS, Redshift, S3, Athena, Glue
- Must be proficient in Git, Jenkins, and CI/CD (Continuous Integration / Continuous Deployment)
- Experience with big data technologies like Hadoop, MapReduce, Spark, etc.
- Experience with Amazon Web Services and Docker
About SteelEye
SteelEye is a fast-growing FinTech company based in London, with offices in Bangalore and Paris, that offers a data platform to help financial institutions such as investment banks, hedge funds, brokerage firms, and asset management firms comply with financial regulations in the European Union. Our clients can aggregate, search, surveil, and report on trade, communications, and market data. SteelEye also enables customers to gain powerful insights from their data, helping them to trade with greater efficiency and profitability. The company has a highly experienced management team and a strong board, with decades of technology and management experience in senior positions at many leading international financial businesses. We are looking to hire a seasoned SRE to join us as we start on our next phase of growth. We have a culture of openness, collaboration, and the passion to get things done, whilst appreciating the importance of a good work-life balance.
Being part of a start-up can be equally exciting as it is challenging. You will be part of the SteelEye team not just because of your talent but also because of your entrepreneurial flare which we thrive on at SteelEye. This means we want you to be curious, contribute, ask questions and share ideas. We encourage you to get involved in helping shape our business.
What you’ll do
- Deliver plugins for our Python-based ETL pipelines.
- Deliver Python microservices for provisioning and managing cloud infrastructure.
- Implement algorithms to analyse large data sets.
- Draft design documents that translate requirements into code.
- Deal with challenges associated with handling large volumes of data.
- Assume responsibilities from technical design through technical client support.
- Manage expectations with internal stakeholders and context-switch in a fast paced environment.
- Thrive in an environment that uses AWS and Elasticsearch extensively.
- Keep abreast of technology and contribute to the engineering strategy.
- Champion best development practices and provide mentorship.
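A plugin for a Python-based ETL pipeline is often just a registered callable that transforms a batch of records. The decorator-based registry below is a generic sketch of that pattern, not SteelEye's actual plugin API; the stage names and record shape are invented:

```python
# Registry mapping stage names to transform functions.
PLUGINS = {}

def etl_plugin(name):
    """Decorator that registers a callable as a named pipeline stage."""
    def register(func):
        PLUGINS[name] = func
        return func
    return register

@etl_plugin("drop_empty")
def drop_empty(records):
    """Discard records with no symbol."""
    return [r for r in records if r.get("symbol")]

@etl_plugin("uppercase_symbols")
def uppercase_symbols(records):
    """Normalise instrument symbols to upper case."""
    return [{**r, "symbol": r["symbol"].upper()} for r in records]

def run_pipeline(records, stages):
    """Apply the named plugin stages to the batch, in order."""
    for stage in stages:
        records = PLUGINS[stage](records)
    return records
```

Because stages are looked up by name, a pipeline's behaviour can be reconfigured from data (e.g. a per-client config file) without touching the engine code.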
What we’re looking for
- Experience in:
  o Python 3
  o Python libraries used for data (such as pandas, numpy)
  o AWS
  o Elasticsearch
  o Performance tuning
  o Object Oriented Design and Modelling
  o Delivering complex software, ideally in a FinTech setting
  o CI/CD tools
- Knowledge of design patterns.
- Sharp analytical and problem-solving skills.
- Strong sense of ownership.
- Demonstrable desire to learn and grow.
- Excellent written and oral communication skills.
- Mature collaboration and mentoring abilities.
About SteelEye Culture
- Work from home until you are vaccinated against COVID-19
- Top of the line health insurance
- Order discounted meals every day from a dedicated portal
- Fair and simple salary structure
- 30+ holidays in a year
- Fresh fruits every day
- Centrally located. 5 mins to the nearest metro station (MG Road)
- Measured on output and not input
Position: Engineering Lead (Platforms) / Product Architect
Location: Bangalore
About Us
The Arth Group comprises companies that operate in the areas of Technology Solutions, Software
Products & Services, Rural Services, and Digital Healthcare. The technology organization serves
as a shared services organization supporting all the group companies.
Job Description Summary
The Engineering Lead (Platforms) / Product Architect will be responsible for leading the
engineering initiatives for the technology platforms being built by the group companies. The
equivalent names for this position are Platform Architect, Product Architect, Software Engineering
Lead, and Product Engineering Lead.
Core Responsibilities
1 Architecture, high-level design, and detailed design of multiple software platforms.
2 Hands-on development for prototyping and initial releases of software, working closely with
the product manager to execute on the product backlog
3 Leading and guiding a technical team of software developers for major releases, code
reviews, and overall responsibility for Tech Quality Assurance (TQA)
4 Technology risk identification and mitigation, new tech evaluation
Example Deliverables
1 Architecture / Design blueprints for both development and deployment
2 Working software – e.g. mobile applications, server applications, APIs, database applications,
integration services
3 Engineering / build plan for feature development, working platform features as per the product
backlog, fixes to any issues reported, enhancements to any features as per user requirements
4 New technology evaluation report
Qualification & Experience
1 Bachelor's / Master's degree from a reputed university
2 Engineering / Architecture certifications (e.g. TOGAF) would be a plus
3 Years of experience: 15-20 years.
4 Experience of Fintech organization or Financial services expertise is a plus.
Technical Competencies
1 Proven experience in architecting and building software platforms, in particular cloud-based,
multi-tenant SaaS architecture
2 Software development expertise:
  Back-end technologies e.g. Java, Python, PHP, C# .NET, JavaScript (Node), etc.
  Front-end technologies e.g. HTML/CSS and frameworks e.g. Bootstrap, Angular, React, Express, etc.
  Mobile applications e.g. native (Android, iOS) and hybrid
  Database technologies e.g. MySQL, Oracle, MS SQL Server, MongoDB, etc.
  Cloud technologies e.g. web services, REST APIs, API gateways, microservices, multi-tenant SaaS applications, usage of cloud AI services
3 Experience in agile software development
4 Experience in low-code platforms and prototyping
5 Expertise in data management & analytics, data visualization, and dashboard development, e.g.
using Microsoft Power BI, Tableau, and similar platforms
Behavioural Competencies
1 Ability to work both as an individual contributor as well as part of a larger team
2 Leadership ability – mentoring and guiding a team of developers, reviewing their software
deliverables, and helping them address any issues
3 Ability to work against stringent deadlines and as per a clearly defined release roadmap and
plan
4 Possessing a positive, can-do attitude – “any problem can be solved”
Reporting and Business Intelligence
Minimum of 8 to 10 years of experience in report building and business intelligence, with
profound knowledge of Oracle BI Publisher and BI Analytics in the Oracle Fusion HCM domain, especially the Core HR, Payroll, and Absences modules.
Primary skills:
- Experience generating BI publisher reports, dashboards and drill-down reports
- BI Publisher (RTF design/ eText/ Scheduling/ Parameter Handling/Bursting/backup and migration of reports to different pods)
- Design Oracle BI Publisher Data Models, Data sets and Templates/sub-templates
- Strong PL/SQL and Advanced PL/SQL experience
- Solid understanding of performance tuning best practices and experience improving end-to-end processing times
- Knowledge on the data models related to Fusion HCM (Core HR, Absence and Payroll modules)
Nice to have skills:
- Strong OBI Administrator experience
- Create mobile applications with Oracle Business Intelligence Mobile App Designer
- Use BI Mobile to access BI Content
- Create and modify Interactive Dashboards
- Knowledge of additional reporting tools viz. HCM Extract / OTBI / Analysis Dashboard would be an added advantage
Behavioural and Process Skills
- Experience working in Agile teams
- Self-motivated and self-directed abilities to prioritize and execute tasks in a high-pressure environment with "time-critical" deadlines
- Proven analytical, evaluative, and problem-solving abilities
- Team-oriented with a strong customer-service mindset
We are urgently looking for Azure Data Factory Engineers with 3 to 7 years of overall experience and strong experience with Azure Data Factory and SSIS.
Requirements:
- Qualification: Engineering Graduate.
- Years of Experience: 2+ years of relevant experience with Azure Data Factory & SSIS.
- Strong understanding of ETL/ELT concepts.
- Hands on experience on Azure Data Factory.
- Exposure of Cloud technologies, standards, and platforms.
- Sound knowledge of ADF pipelines, configuration, parameters, variables, and the integration runtime (part of Data Factory).
- Strong Experience on SSIS.
- Sound knowledge of implementing slowly changing dimensions and transactional, factless, and snapshot fact tables.
- Azure Data Warehouse.
- Knowledge of Azure components, especially file storage, Azure SQL, Blob Storage, and Data Lake Gen2 storage (used as a file system on top of blob storage).
- Azure Data Migration tool.
- Well versed with RDBMS systems like SQL Server, including DDL and DML scripts.
- Sound knowledge of T-SQL and stored procedures.
- Strong Query writing skills, Scheduling tools.
- Ensure quality deliverables and the ability to extend work hours if required.
- Azure Certification: Preferred.
- Ability to understand Customer requirements and convert the requirements into technical specifications.
- Ability to understand the overall solution and design components to fit the overall solution.
- Ability to take technical decisions and support the decisions with justifications.
- Ability to work in Agile manner.
- Ability to work independently without much guidance.
- Self-Motivated, team player, results oriented, strong Time Management skills.
- Excellent written and verbal communication skills.
- Willing to travel outside India (short term as well as long term)
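The slowly-changing-dimension requirement above refers to logic like the following Type 2 sketch. In ADF/SSIS this would typically be implemented as a T-SQL MERGE against the warehouse; plain Python is used here only to show the idea, and the column names are invented:

```python
from datetime import date

def scd2_upsert(dim_rows, incoming, today):
    """SCD Type 2: expire changed current rows and append new versions.

    dim_rows: list of dicts with key, attr, valid_from, valid_to, is_current.
    incoming: list of dicts with key, attr (today's snapshot from the source).
    """
    current = {r["key"]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["key"])
        if cur is None:
            # Brand-new key: insert the first version.
            dim_rows.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        elif cur["attr"] != rec["attr"]:
            # Attribute changed: close out the old version, append the new one.
            cur["valid_to"] = today
            cur["is_current"] = False
            dim_rows.append({**rec, "valid_from": today, "valid_to": None, "is_current": True})
        # Unchanged rows are left untouched, preserving their history.
    return dim_rows
```

Keeping expired versions with their validity window is what lets a fact table be joined to the dimension "as of" any past date.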