


Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing and tooling to improve quality.
- Reviewing and approving high-level and detailed designs to ensure that the solution delivers on business needs and aligns with the data & analytics architecture principles and roadmap.
- Understanding business requirements and solution design to develop and implement solutions that adhere to big data architectural guidelines and address business requirements.
- Following proper SDLC (Code review, sprint process).
- Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, etc.
- Building robust and scalable data infrastructure (both batch processing and real-time) to support needs from internal and external users.
- Understanding various data security standards and using data security tools to apply and adhere to the required data controls for user access on the Hadoop platform.
- Supporting and contributing to development guidelines and standards for data ingestion.
- Working with the data science and business analytics teams to assist with data ingestion and data-related technical issues.
- Designing and documenting the development & deployment flow.
Requirements:
- Experience in developing REST API services using one of the Scala frameworks.
- Ability to troubleshoot and optimize complex queries on the Spark platform.
- Expertise in building and optimizing ‘big data’ data/ML pipelines, architectures, and data sets.
- Knowledge of modelling unstructured data into structured data designs.
- Experience in Big Data access and storage techniques.
- Experience in cost estimation based on the design and development effort.
- Excellent debugging skills across the technical stack mentioned above, including analyzing server and application logs.
- Highly organized, self-motivated, proactive, and able to propose the best design solutions.
- Good time management and multitasking skills, with the ability to meet deadlines while working both independently and as part of a team.
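A common way to optimize a complex Spark query is to avoid shuffling a small lookup table by broadcasting it to every worker. As a hedged illustration only (the function and field names below are hypothetical, not part of this posting's stack), the idea behind a broadcast (map-side) join can be sketched in plain Python:

```python
# Sketch of the idea behind Spark's broadcast-join optimization:
# instead of shuffling both sides of a join, ship the small dimension
# table to every worker as an in-memory dict and join locally.

def broadcast_join(facts, small_dim, key):
    """Join a large iterable of fact rows against a small dimension table.

    facts     -- iterable of dicts (the large side)
    small_dim -- list of dicts small enough to hold in memory (the broadcast side)
    key       -- join column present in both sides
    """
    lookup = {row[key]: row for row in small_dim}  # the "broadcast" copy
    for fact in facts:
        dim = lookup.get(fact[key])
        if dim is not None:  # inner-join semantics: drop unmatched facts
            yield {**fact, **dim}

facts = [{"user_id": 1, "amount": 10}, {"user_id": 2, "amount": 5}]
dims = [{"user_id": 1, "country": "MU"}]
joined = list(broadcast_join(facts, dims, "user_id"))
```

In Spark itself the equivalent is a `broadcast()` hint on the small DataFrame; the sketch just shows why it removes the shuffle.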

Similar jobs
Role: Content Writer - Influencer Script
Exp: 1-3 Years
Salary: up to 6 LPA
City: Mumbai
Job Description:
- Develop compelling scripts tailored to the content concept and vision for influencers to use in video productions.
- Infuse creativity and originality into scriptwriting to captivate the audience and align with the influencer's brand and style.
- Focus solely on scriptwriting for influencers; experience in advertising copywriting is not applicable for this role.
- Collaborate closely with influencers, content creators, and other team members to ensure scripts meet the desired objectives and resonate with the target audience.
- Stay updated on current trends and preferences in influencer marketing and social media content to continuously enhance script quality and relevance.
- Adapt writing style and tone to suit various platforms and audience demographics while maintaining consistency with the influencer's brand identity.
- Incorporate feedback and iterate on scripts as necessary to optimize engagement and effectiveness.
Bangalore / Chennai
- Hands-on data modelling for OLTP and OLAP systems
- In-depth knowledge of Conceptual, Logical and Physical data modelling
- Strong understanding of indexing, partitioning, and data sharding, with hands-on experience applying them.
- Strong understanding of variables impacting database performance for near-real-time reporting and application interaction.
- Working experience with at least one data modelling tool, preferably DBSchema or Erwin.
- Good understanding of GCP databases like AlloyDB, CloudSQL, and BigQuery.
- Functional knowledge of the mutual fund industry is a plus.
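Data sharding, one of the techniques listed above, routes each row to one of N partitions by hashing its key. A minimal sketch in plain Python (the `shard_for` helper and the key format are illustrative assumptions, not a tool named in the posting; production systems usually add consistent hashing or virtual nodes so resharding moves less data):

```python
import hashlib

def shard_for(key: str, num_shards: int) -> int:
    """Stable shard assignment: the same key always maps to the same shard.

    Uses MD5 purely for an even, deterministic spread of keys,
    not for any security purpose.
    """
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# Every lookup for the same key lands on the same shard,
# which is what lets reads and writes be routed without a directory.
shard = shard_for("folio-12345", 8)
```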
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement Dimensional and Fact tables.
● Identify and implement data transformation/cleansing requirements
● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, and develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional information and data from ERP, CRM, and HRIS applications to model, extract, and transform it for reporting & analytics.
● Define and document the use of BI through user experience/use cases and prototypes; test and deploy BI solutions.
● Develop and support data governance processes; analyze data to identify and articulate trends, patterns, outliers, and quality issues; continuously validate reports and dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.
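The dimensional and fact tables mentioned in the responsibilities above form a star schema: a dimension table keyed by a surrogate key, and a fact table holding foreign keys plus additive measures. A tiny sketch in plain Python (table and column names are illustrative assumptions, not this employer's actual warehouse):

```python
# Dimension table: surrogate key -> descriptive attributes.
customers = {
    1: {"customer_name": "Acme", "region": "APAC"},
    2: {"customer_name": "Globex", "region": "EMEA"},
}

# Fact table: foreign key into the dimension plus additive measures.
sales = [
    {"customer_sk": 1, "amount": 120.0},
    {"customer_sk": 1, "amount": 80.0},
    {"customer_sk": 2, "amount": 50.0},
]

# A typical BI query over this shape: total sales by region
# (join fact to dimension on the surrogate key, then group by region).
totals = {}
for fact in sales:
    region = customers[fact["customer_sk"]]["region"]
    totals[region] = totals.get(region, 0.0) + fact["amount"]
```

The same shape in SQL would be a `JOIN` on the surrogate key followed by `GROUP BY region`; surrogate keys keep the fact table narrow and insulate it from changes in natural business keys.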
Required Skills:
● Bachelor’s degree in Computer Science or similar field or equivalent work experience.
● 5+ years of experience on Data Warehousing, Data Engineering or Data Integration projects.
● Expert with data warehousing concepts, strategies, and tools.
● Strong SQL background.
● Strong knowledge of relational databases like SQL Server, PostgreSQL, MySQL.
● Strong experience in GCP: Google BigQuery, Cloud SQL, Composer (Airflow), Dataflow, Dataproc, Cloud Functions, and GCS.
● Good to have: knowledge of SQL Server Reporting Services (SSRS) and SQL Server Integration Services (SSIS).
● Knowledge of AWS and Azure Cloud is a plus.
● Experience with Informatica PowerExchange for Mainframe, Salesforce, and other new-age data sources.
● Experience in integration using APIs, XML, JSON, etc.


This role is physically based in Mauritius and will require you to relocate there. Relocation expenses such as air tickets, work visa/permit, and the medical examination for the work permit will be borne by the company.
The role of a Senior Java Developer is to ensure high-quality software development and delivery to the end client.
As a senior software developer, you shall be the technical SME on the project and ensure that the team abides by the quality standards set on the project.
In the Java Expertise Center, you shall contribute to the knowledge-building committee, where you will be coaching juniors and academy students.
Key Skills:
Very good knowledge of Java (Java 8 or later).
Good knowledge of Spring Boot.
Experience with JPA/Hibernate.
Experience with middleware such as message queues.
Experience with REST interfaces and gRPC.
Experience with modern agile software development (Continuous Integration).
Experience in securing APIs.
Knowledge of design patterns.
1. Software Development
Act autonomously in the delivery of tasks of simple to high complexities on projects.
Participate actively with team members to reduce risks related to tasks and activities.
Ensure deliveries (code, documentation, release notes) are executed within set quality standards and processes.
Understand and apply standard methods, tools and processes in daily tasks.
Participate in sprint backlog estimation.
2. Team coaching & technical reference on project.
Work with the team architect to help coach team members and ensure they abide by the technical standards set on the project.
Coach new joiners and accompany them technically & functionally on the project.
Be the technical reference on the project, besides the architect.
Communicate technical KPIs to your project manager.
Review code contributed by team members, ensuring adherence to coding standards, best practices, and quality standards.
Act as a coach or trainer and contribute actively to the Java Expertise Center.
Commit to continuous learning and stay updated with industry trends and best practices.
Participate in budget auditing activities as required by your direct report, in the capacity of a Developer.
Key Dimensions:
Ability to monitor unit test coverage and ensure the team abides by the same standard (standard unit test coverage of 60% on new and overall code).
Ensure the project's Sonar analysis meets the quality gate and that any deviations are tackled by the team.
Ensure the highest quality is delivered to the client.
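The 60% coverage gate described above applies to both new code and the overall codebase. As a small illustrative sketch (the function names and the exact gating policy are assumptions; a real setup would read these numbers from JaCoCo or SonarQube rather than hard-code them):

```python
# Hypothetical sketch of a dual coverage gate: both new code and the
# overall codebase must meet the 60% line-coverage standard.
THRESHOLD = 60.0

def coverage_pct(covered_lines: int, total_lines: int) -> float:
    """Line coverage as a percentage; an empty scope counts as fully covered."""
    return 100.0 * covered_lines / total_lines if total_lines else 100.0

def gate_passes(new_cov: int, new_total: int,
                all_cov: int, all_total: int) -> bool:
    """Pass only if new code AND overall code both reach the threshold."""
    return (coverage_pct(new_cov, new_total) >= THRESHOLD
            and coverage_pct(all_cov, all_total) >= THRESHOLD)

# 70% coverage on new code, 65% overall: both sides clear the 60% bar.
ok = gate_passes(70, 100, 6500, 10000)
```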

We are seeking a Production Support Engineer to join our team.
Responsibilities:
- Be the first line of defense for production and test environment issues.
- Work collaboratively with the team to identify, manage, and resolve ongoing incidents.
- Troubleshoot and connect with appropriate teams to effectively triage issues impacting test and production environments.
- Understand system architecture, upstream, and downstream dependencies to enable effective participation in triage and restoration activities.
- Perform systems monitoring of applications within the IRS domain after service restoration and post patching, maintenance, and upgrades.
- Create necessary service tickets and ensure tickets are routed to the appropriate technical teams.
- Provide weekend support for various activities including patching, release deployments, security updates, and 3rd party updates.
- Keep up with info alerts, patching alerts, and delivery partners' activities.
- Update stakeholders to plan for upcoming maintenance as well as alert them about service issues and restoration.
- Manage and communicate about upcoming maintenance in the test environment on a daily basis.
- Liaise with various stakeholders to gain approval for alert communications, including confirmation before an all-clear communication.
- Work closely with testing and development teams to prepare for infrastructure updates and release readiness.
- Submit Application Redirects tickets for planned maintenance after gaining approval from management.
- Participate in analysis and improvement of system performance.
- Host daily operational standup.
- Provide additional support to existing production support procedures and process improvements.
- Provide regular status reports to management on application status and other metrics.
- Collaborate with management to improve and customize reports related to production support.
- Plan and manage support for incident management tools and processes.
Requirements:
- Bachelor's Degree in computer science, engineering, or related field.
- AWS Cloud certification.
- 3+ years of relevant IT work experience with cloud experience.
- Knowledge of Java and microservice development and deployments.
- Understanding of the business processes behind applications.
- Strong analytical, problem-solving, negotiation, task and project management, and organizational skills.
- Strong oral and written communication skills, including process documentation.
- Proficiency in Microsoft Office applications (Word, PowerPoint, Excel, and Project).
- Knowledge of computer systems, databases, and SharePoint.
- Knowledge of Splunk and AppDynamics.
Benefits:
- Work Location: Remote
- 5 days working
You can apply directly through the link: https://zrec.in/gQWFK?source=CareerSite
Explore our Career Page for more such jobs: careers.infraveo.com

Required Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent experience.
- 5+ years of experience in a DevOps role, preferably for a SaaS or software company.
- Expertise in cloud computing platforms (e.g., AWS, Azure, GCP).
- Proficiency in scripting languages (e.g., Python, Bash, Ruby).
- Extensive experience with CI/CD tools (e.g., Jenkins, GitLab CI, Travis CI).
- Extensive experience with NGINX and similar web servers.
- Strong knowledge of containerization and orchestration technologies (e.g., Docker, Kubernetes).
- Familiarity with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
- Ability to work on-call as needed and respond to emergencies in a timely manner.
- Experience with high-transaction e-commerce platforms.
Preferred Qualifications:
- Certifications in cloud computing or DevOps are a plus (e.g., AWS Certified DevOps Engineer, Azure DevOps Engineer Expert).
- Experience in a high-availability, 24x7x365 environment.
- Strong collaboration, communication, and interpersonal skills.
- Ability to work independently and as part of a team.
| Mandatory Skills | Good to Have Skills |
| JDK 1.8+ | PLSQL |
| Microservices | Knowledge of SQL Performance Tuning |
| Spring framework | Experience with Cloud |
| Spring Boot | Knowledge of Cloud Foundry (Pivotal CF) |
| Java Messaging Services (JMS) / Kafka / Rabbit MQ | On-call support experience with PagerDuty, Service Now |
| SOAP & REST APIs | Knowledge of infrastructure monitoring tools like Nagios, New Relic |
| SQL | Knowledge of Splunk |
| ORM technologies like Hibernate / IBatis / MyBatis | Experience working with Lean and Extreme Programming (XP) |
| Continuous integration tools like Jenkins / Bamboo | NoSQL databases such as MongoDB or any other NoSQL DB |
| Code Quality tools/frameworks like Sonar / PMD | Automated Integration Testing & Contract Testing |
| Automated Testing, Unit Testing | |
| Agile development methodologies | |
Our client is an innovative fintech company that is revolutionizing the business of short-term finance. The company is an online lending startup driven by an app-enabled technology platform that solves the funding challenges of SMEs by offering quick-turnaround, paperless business loans without collateral. It counts over 2 million small businesses across 18 cities and towns as its customers. Its founders are IIT and ISB alumni with deep experience in the fintech industry, gained earlier at organizations like Axis Bank, Aditya Birla Group, Fractal Analytics, and Housing.com. It has raised funds of Rs. 100 Crore from finance industry stalwarts and is growing by leaps and bounds.
- Ensuring ease of data availability, with relevant dimensions, using Business Intelligence tools.
- Providing strong reporting and analytical information support to the management team.
- Transforming raw data into essential metrics based on the needs of relevant stakeholders.
- Performing data analysis for generating reports on a periodic basis.
- Converting essential data into easy to reference visuals using Data Visualization tools (PowerBI, Metabase).
- Providing recommendations to update current MIS to improve reporting efficiency and consistency.
- Bringing fresh ideas to the table and keenly observing trends in the analytics and financial services industries.
What you need to have:
- B.Tech/B.E. or MBA/PGDM graduate, with 3+ years of work experience.
- Experience in Reporting, Data Management (SQL, MongoDB), Visualization (PowerBI, Metabase, Data studio)
- Work experience in financial services (Indian banks'/NBFCs' in-house analytics units, or fintech/analytics start-ups) would be a plus.
- Skilled at writing and optimizing large, complicated SQL queries and MongoDB scripts.
- Strong knowledge of Banking/ Financial Services domain
- Experience with some of the modern relational databases
- Self-driven, with the ability to work on multiple projects of a different nature.
- Liaise with cross-functional teams to resolve data issues and build strong reports.


Greetings from Alliance Labs.
We are a financial tech company and are looking for a software developer to join us.
You will be building products used by our clients worldwide.
Thanks,
Anuj



