50+ SQL Azure Jobs in India
at Delivery Solutions
About UPS:
Moving our world forward by delivering what matters! UPS is a company with a proud past and an even brighter future. Our values define us. Our culture differentiates us. Our strategy drives us. At UPS we are customer-first, people-led, and innovation-driven. UPS's India-based Technology Development Centers will bring UPS one step closer to creating a global technology workforce that will help accelerate our digital journey and help us engineer technology solutions that drastically improve our competitive advantage in the field of logistics.
Job Summary:
- Applies the principles of software engineering to design, develop, maintain, test, and evaluate computer software that provides business capabilities, solutions, and/or product suites. Provides systems life cycle management (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software, etc.) to ensure delivery of technical solutions on time and within budget.
- Researches and supports the integration of emerging technologies.
- Provides knowledge and support for applications’ development, integration, and maintenance.
- Develops program logic for new applications or analyzes and modifies logic in existing applications.
- Analyzes requirements, tests, and integrates application components.
- Ensures that system improvements are successfully implemented. May focus on web/internet applications specifically, using a variety of languages and platforms.
REQUIREMENTS
- Experience with Azure Databricks, SQL, and ETL (SSIS packages) – very critical.
- Azure Data Factory, Function Apps, DevOps – a must.
- Experience with Azure and other cloud technologies.
- Databases – Oracle, SQL Server, and Cosmos DB experience needed.
- Azure services (Key Vault, App Configuration, Blob Storage, Redis Cache, Service Bus, Event Grid, ADLS, Application Insights, etc.)
- Knowledge of Striim.
Preferred skills: Microservices experience preferred. Experience with Angular and .NET Core – not critical.
Additional Information: This role will be in-office 3 days a week in Chennai, India.
Required Skill Set:
- Data Model & Mapping
- MS SQL Database
- Analytics SQL Query
- Genesys Cloud Reporting & Analytics API
- Snowflake (good to have)
- Cloud Exposure – AWS or Azure
Technical Experience –
· 5-8 years of experience, preferably at a technology or financial firm
· Strong understanding of data analysis & reporting tools.
· Experience with data mining & machine learning techniques.
· Excellent communication & presentation skills
· Must have at least 2-3 years of experience in data modeling/analysis/mapping
· Must have hands on experience in database tools & technologies
· Must have exposure to Genesys Cloud, WFM, GIM, and the Genesys Analytics API
· Good to have experience or exposure to Salesforce, AWS or Azure, and Genesys Cloud
· Ability to work independently & as part of a team
· Strong attention to detail and accuracy.
Work Scope –
- Data model similar to the GIM database, based on Genesys Cloud data.
- API-to-column data mapping.
- Data model for business analytics.
- Database artifacts.
- Scripting – Python
- Autosys, TWS job setup.
TVARIT GmbH develops and delivers artificial intelligence (AI) solutions for the manufacturing, automotive, and process industries. With its software products, TVARIT makes it possible for its customers to make intelligent and well-founded decisions, e.g., in predictive maintenance, increasing OEE, and predictive quality. We have renowned reference customers, competent technology, a good research team from renowned universities, and a renowned AI award (e.g., EU Horizon 2020), which makes TVARIT one of the most innovative AI companies in Germany and Europe.
We are looking for a self-motivated person with a positive "can-do" attitude and excellent oral and written communication skills in English.
We are seeking a skilled and motivated Data Engineer from the manufacturing industry, with over two years of experience, to join our team. As a data engineer, you will be responsible for designing, building, and maintaining the infrastructure required for the collection, storage, processing, and analysis of large and complex data sets. The ideal candidate will have a strong foundation in ETL pipelines and Python, with additional experience in Azure and Terraform being a plus. This role requires a proactive individual who can contribute to our data infrastructure and support our analytics and data science initiatives.
Skills Required
- Experience in the manufacturing industry (metal industry is a plus)
- 2+ years of experience as a Data Engineer
- Experience in data cleaning & structuring and data manipulation
- ETL Pipelines: Proven experience in designing, building, and maintaining ETL pipelines.
- Python: Strong proficiency in Python programming for data manipulation, transformation, and automation.
- Experience in SQL and data structures
- Knowledge of big data technologies such as Apache Spark, Flink, and Hadoop, and of NoSQL databases.
- Knowledge of cloud technologies (at least one) such as AWS, Azure, and Google Cloud Platform.
- Proficient in data management and data governance
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork abilities.
Nice To Have
- Azure: Experience with Azure data services (e.g., Azure Data Factory, Azure Databricks, Azure SQL Database).
- Terraform: Knowledge of Terraform for infrastructure as code (IaC) to manage cloud resources.
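As a rough illustration of the ETL-pipeline and Python skills listed above, here is a minimal extract-transform-load sketch. It uses an in-memory SQLite database and made-up sensor data; `RAW`, `readings`, and the column names are all illustrative, not part of any real system:

```python
import csv
import io
import sqlite3

# Hypothetical machine sensor readings; in a real pipeline these would
# come from factory data sources (file share, API, message queue).
RAW = """machine_id,temp_c,ts
M1,71.5,2024-01-01T00:00
M1,,2024-01-01T01:00
M2,68.0,2024-01-01T00:00
"""

def extract(text):
    # Extract: parse the raw CSV into dictionaries.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Transform: drop rows with a missing temperature and cast types
    # (a typical data cleaning/structuring step).
    return [
        (r["machine_id"], float(r["temp_c"]), r["ts"])
        for r in rows
        if r["temp_c"]
    ]

def load(rows, conn):
    # Load: write the cleaned rows into a warehouse-style table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings (machine_id TEXT, temp_c REAL, ts TEXT)"
    )
    conn.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM readings").fetchone()[0]

conn = sqlite3.connect(":memory:")
n = load(transform(extract(RAW)), conn)
print(n)  # 2 -- the row with a missing temperature is dropped
```

The same extract/transform/load shape carries over to production tools (e.g., Spark jobs or Azure Data Factory activities); only the I/O endpoints and scale change.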
Job Description
The Senior IT Specialist will play a pivotal role in shaping our cloud and data strategies. This role requires an individual who is not only deeply technical but also possesses a strategic mindset to integrate Azure solutions seamlessly into our business operations.
- Lead the administration and optimization of our Azure cloud environment, ensuring robustness, scalability, and security.
- Manage and configure Intune for mobile device and application management, aligning with our security and compliance requirements
- Solution Architecture competency
- Design and implement data architectures that support scalability, reliability, and performance across our enterprise.
- Develop, deploy, and manage Azure data warehousing solutions, leveraging Azure Synapse Analytics, to support advanced analytics initiatives.
- Administer SQL Server databases, ensuring high availability, optimal performance, and security.
- Collaborate with various teams to understand data needs and deliver scalable and efficient data solutions.
- Conduct regular audits and performance tuning of Azure services and SQL Server databases to ensure cost-effectiveness and efficiency.
- Stay updated with the latest advancements in Azure services, data warehousing technologies, and analytics trends to continually enhance our capabilities.
- Develop comprehensive documentation and best practices to ensure knowledge sharing and consistent application of Azure and data solutions across the organization.
Requirements
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
- Minimum of 7 years of experience in IT, with at least 4 years focused on Azure cloud services, data architecture, and analytics.
- Extensive experience with Azure services, particularly Azure Active Directory, Intune, Azure SQL Database, Azure Synapse Analytics, and other related services.
- Proficient in designing and implementing complex data architectures and data warehousing solutions.
- In-depth knowledge of SQL Server administration, performance tuning, and security.
- Strong understanding of best practices for cloud security, data protection, and compliance.
- Excellent analytical, problem-solving, and project management skills.
- Effective communication and collaboration abilities, capable of leading projects and working with cross-functional teams.
- Relevant certifications in Azure, data architecture, or SQL Server are highly desirable.
Benefits (among others)
Growth
- Join a market-leading firm in one of the fastest-growing industries to pioneer the next generation of automated technologies
- Tailored development plan with career coaching
- Regular feedback – we value your opinion, and we appreciate new ideas
- International career and project opportunities
Team
- Immerse yourself in an exciting, positive, and creative atmosphere and generate novel solutions to some of our clients' most complex challenges
- Working in different international and cross-cultural teams for each project
at Gipfel & Schnell Consultings Pvt Ltd
Data Engineer
Brief Posting Description:
This person will work independently or with a team of data engineers on cloud technology products, projects, and initiatives. They will work with all customers, both internal and external, to make sure all data-related features are implemented in each solution, and will collaborate with business partners and other technical teams across the organization as required to deliver proposed solutions.
Detailed Description:
· Works with Scrum masters, product owners, and others to identify new features for digital products.
· Works with IT leadership and business partners to design features for the cloud data platform.
· Troubleshoots production issues of all levels and severities, and tracks progress from identification through resolution.
· Maintains culture of open communication, collaboration, mutual respect and productive behaviors; participates in the hiring, training, and retention of top tier talent and mentors team members to new and fulfilling career experiences.
· Identifies risks, barriers, efficiencies and opportunities when thinking through development approach; presents possible platform-wide architectural solutions based on facts, data, and best practices.
· Explores all technical options when considering solution, including homegrown coding, third-party sub-systems, enterprise platforms, and existing technology components.
· Actively participates in collaborative effort through all phases of software development life cycle (SDLC), including requirements analysis, technical design, coding, testing, release, and customer technical support.
· Develops technical documentation, such as system context diagrams, design documents, release procedures, and other pertinent artifacts.
· Understands lifecycle of various technology sub-systems that comprise the enterprise data platform (i.e., version, release, roadmap), including current capabilities, compatibilities, limitations, and dependencies; understands and advises of optimal upgrade paths.
· Establishes relationships with key IT, QA, and other corporate partners, and regularly communicates and collaborates accordingly while working on cross-functional projects or production issues.
Job Requirements:
EXPERIENCE:
2 years required; 3-5 years preferred, in a data engineering role.
2 years required; 3-5 years preferred, in Azure data services (Data Factory, Databricks, ADLS, Synapse, SQL DB, etc.)
EDUCATION:
Bachelor’s degree in information technology, computer science, or a data-related field preferred
SKILLS/REQUIREMENTS:
Expertise working with databases and SQL.
Strong working knowledge of Azure Data Factory and Databricks
Strong working knowledge of code management and continuous integration systems (Azure DevOps or GitHub preferred)
Strong working knowledge of cloud relational databases (Azure Synapse and Azure SQL preferred)
Familiarity with Agile delivery methodologies
Familiarity with NoSQL databases (such as CosmosDB) preferred.
Any experience with Python, DAX, Azure Logic Apps, Azure Functions, IoT technologies, PowerBI, Power Apps, SSIS, Informatica, Teradata, Oracle DB, and Snowflake preferred but not required.
Ability to multi-task and reprioritize in a dynamic environment.
Outstanding written and verbal communication skills
Working Environment:
General Office – Work is generally performed within an office environment, with standard office equipment. Lighting and temperature are adequate and there are no hazardous or unpleasant conditions caused by noise, dust, etc.
Physical requirements:
Work is generally sedentary in nature but may require standing and walking for up to 10% of the time.
Mental requirements:
Employee required to organize and coordinate schedules.
Employee required to analyze and interpret complex data.
Employee required to problem-solve.
Employee required to communicate with the public.
Responsibilities:
- Develop application modules independently and fix any bugs promptly.
- Perform unit testing for the development work carried out
- Act as a mentor to the junior resources and provide technical guidance.
- Troubleshoot problems and provide solutions.
- Conduct and participate in project planning & scheduling, design discussions, and provide assistance during testing.
- Remain up to date with the modern industry practices involved in designing & developing high-quality software.
- Ability to perform performance analysis and identify and fix bottlenecks.
Must have
- Must have at least 2 years of experience in MERN Stack development.
- Prepare technical documentation as required by the project.
- Must possess strong analytical skills to be able to break down complex problems into smaller atomic units of work.
- Good knowledge of Express.js, React and JS libraries.
- Clear understanding of JavaScript and Typescript.
- Sound understanding of MVC and design patterns.
- Excellent grasp of data structures and of designing and developing REST APIs.
- Good skills in either RDBMS (e.g. MySQL or PostgreSQL) or NoSQL (MongoDB or equivalent).
- Experience in developing responsive web applications.
- Good communication skills.
- Willingness to learn and adopt new technologies in a short period of time as required by the project.
- Sound understanding of Agile and Scrum methodologies and ability to participate in Sprint ceremonies.
Nice to have:
- Good grasp of UI / UX concepts
- Experience in using Git & VSCode.
- Knowledge of AWS, Azure, CI / CD, Gitflow, shell scripting
- Ability to build/own/maintain a comprehensive set of component libraries for a React JS UI
- Ability to design/develop for cross-browser/device compatibility
We strive to create an environment where differences are not only accepted but greatly valued; where everyone can make the most of their capabilities and potential. We promote meritocracy, competence and a sharing of ideas and opinions. We are driven by data and believe the diversity, agility, generosity, and curiosity of our people is what sets us apart as an organization and helps us thrive.
Responsibilities:
- Collaborate with cross-functional teams to define, design, and ship new features.
- Develop high-quality software solutions in C#/.NET according to technical specifications.
- Participate in code reviews and provide constructive feedback to peers.
- Debug, troubleshoot, and resolve software defects to ensure optimal performance.
- Assist in the maintenance and enhancement of existing software applications.
- Stay up-to-date with the latest .NET technologies and industry trends.
- Document software features, technical specifications, and implementation details.
- Contribute to the continuous improvement of development processes and best practices.
- Communicate effectively with team members and stakeholders to ensure project success.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or related field.
- Minimum 1 year of professional experience in software development using C# and the .NET framework.
- Solid understanding of object-oriented programming principles and design patterns.
- Proficiency in Microsoft technologies such as ASP.NET, MVC, and Entity Framework.
- Experience with front-end development technologies like HTML, CSS, JavaScript, and jQuery.
- Familiarity with relational databases (e.g., SQL Server) and SQL queries.
- Strong analytical and problem-solving skills with attention to detail.
- Ability to work both independently and collaboratively in a fast-paced environment.
- Excellent verbal and written communication skills.
- Demonstrated willingness to learn and adapt to new technologies and methodologies.
Data Scientist – Program Embedded
Job Description:
We are seeking a highly skilled and motivated senior data scientist to support a big data program. The successful candidate will play a pivotal role in supporting multiple projects in this program, covering traditional tasks from revenue management, demand forecasting, and improving customer experience to testing/using new tools and platforms such as Copilot and Fabric for different purposes. The expected candidate will have deep expertise in machine learning methodology and applications, and should have completed multiple large-scale data science projects (full cycle, from ideation to BAU). Beyond technical expertise, problem solving in complex set-ups will be key to success in this role. This is a data science role directly embedded into the program/projects; stakeholder management and collaboration with partners are crucial to success in this role (on top of the deep expertise).
What we are looking for:
- Highly proficient in Python/PySpark/R.
- Understanding of MLOps concepts and working experience in product industrialization (from a data science point of view). Experience in building products for live deployment, and in continuous development and continuous integration.
- Familiar with cloud platforms such as Azure and GCP, and with the data management systems on such platforms. Familiar with Databricks and product deployment on Databricks.
- Experience in ML projects involving techniques: Regression, Time Series, Clustering, Classification, Dimension Reduction, Anomaly detection with traditional ML approaches and DL approaches.
- Solid background in statistics: probability distributions, A/B testing validation, univariate/multivariate analysis, hypothesis testing for different purposes, data augmentation, etc.
- Familiar with designing testing framework for different modelling practice/projects based on business needs.
- Exposure to Gen AI tools and enthusiastic about experimenting and have new ideas on what can be done.
- If they have improved an internal company process using an AI tool, that would be great (e.g. process simplification, manual task automation, auto emails)
- Ideally, 10+ years of experience, and have been on independent business facing roles.
- CPG or retail as a data scientist would be nice, but not number one priority, especially for those who have navigated through multiple industries.
- Being proactive and collaborative would be essential.
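As a minimal illustration of one of the techniques listed above (regression), the sketch below fits a linear trend with ordinary least squares on synthetic data. In a real demand-forecasting project this would be replaced by proper time-series models and a validation framework; the data here is entirely made up:

```python
import numpy as np

# Synthetic "demand" series: linear trend plus small Gaussian noise.
rng = np.random.default_rng(0)
t = np.arange(100, dtype=float)
y = 3.0 + 0.5 * t + rng.normal(0.0, 0.1, size=t.size)

# Ordinary least squares via the design matrix [1, t].
X = np.column_stack([np.ones_like(t), t])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # coef = [intercept, slope]

pred = X @ coef
resid = y - pred  # residuals should look like the injected noise

print(round(coef[1], 2))  # recovered slope, close to the true 0.5
```

The same pattern (design matrix, fit, residual inspection) generalizes to the multivariate and regularized regressions a project like this would actually use.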
Some projects examples within the program:
- Test new tools/platforms such as Copilot and Fabric for commercial reporting: testing, validation, and building trust.
- Building algorithms for predicting trends in category consumption to support dashboards.
- Revenue Growth Management, create/understand the algorithms behind the tools (can be built by 3rd parties) we need to maintain or choose to improve. Able to prioritize and build product roadmap. Able to design new solutions and articulate/quantify the limitation of the solutions.
- Demand forecasting, create localized forecasts to improve in store availability. Proper model monitoring for early detection of potential issues in the forecast focusing particularly on improving the end user experience.
KEY RESPONSIBILITIES
· Develop high-quality database solutions.
· Use T-SQL to develop and implement procedures and functions.
· Review and interpret ongoing business report requirements.
· Research required data.
· Build appropriate and useful reporting deliverables.
· Analyze existing SQL queries for performance improvements.
· Suggest new queries.
· Develop procedures and scripts for data migration.
· Provide timely scheduled management reporting.
· Investigate exceptions with regard to asset movements.
MUST-HAVES FOR THIS GIG
T-SQL, stored procedures, functions, triggers, XML operations, JSON support on SQL Server 2016 and above, SSIS, SSRS, CTEs, EAV data structures, integration with NoSQL (MongoDB), SQL Server indexes, bulk insert, BCP, cmd shell, memory optimization, performance tuning, query optimization, database design, table joins, SQL Server Agent jobs, backup and maintenance plans, data migration, good communication.
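A few of the must-have topics above (bulk insert, indexing, checking query plans) can be illustrated with a small sketch. It uses SQLite via Python for portability rather than SQL Server, so the syntax differs from T-SQL, and the table/index names are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE assets (id INTEGER PRIMARY KEY, owner TEXT, qty INTEGER)"
)

# Bulk insert via executemany -- a rough analogue of BULK INSERT / BCP.
rows = [(i, f"owner{i % 10}", i * 3) for i in range(1000)]
conn.executemany("INSERT INTO assets VALUES (?, ?, ?)", rows)

# An index on the filtered column lets the planner avoid a full table scan.
conn.execute("CREATE INDEX idx_assets_owner ON assets(owner)")

# EXPLAIN QUERY PLAN (SQLite's counterpart to a SQL Server execution plan)
# shows whether the index is actually used.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(qty) FROM assets WHERE owner = 'owner3'"
).fetchall()

total = conn.execute(
    "SELECT SUM(qty) FROM assets WHERE owner = 'owner3'"
).fetchone()[0]
print(total)
```

The habit the example demonstrates (load in bulk, index the predicate column, read the plan before tuning) carries over directly to SQL Server index and query optimization work.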
NICE-TO-HAVES FOR THIS GIG:
- Working knowledge of mobile development activity.
- Working knowledge of web hosting solution on IIS7.
Experience working with an offshore-onsite development process
Title:- Data Scientist
Experience:-6 years
Work Mode:- Onsite
Primary Skills:- Data Science, SQL, Python, Data Modelling, Azure, AWS, Banking Domain (BFSI/NBFC)
Qualification:- Any
Roles & Responsibilities:-
1. Acquiring, cleaning, and preprocessing raw data for analysis.
2. Utilizing statistical methods and tools for analyzing and interpreting complex datasets.
3. Developing and implementing machine learning models for predictive analysis.
4. Creating visualizations to effectively communicate insights to both technical and non-technical stakeholders.
5. Collaborating with cross-functional teams, including data engineers, business analysts, and domain experts.
6. Evaluating and optimizing the performance of machine learning models for accuracy and efficiency.
7. Identifying patterns and trends within data to inform business decision-making.
8. Staying updated on the latest advancements in data science, machine learning, and relevant technologies.
Requirement:-
1. Experience with modeling techniques such as Linear Regression, clustering, and classification techniques.
2. Must have a passion for data, structured or unstructured. 0.6 – 5 years of hands-on experience with Python and SQL is a must.
3. Should have sound experience in data mining, data analysis and machine learning techniques.
4. Excellent critical thinking, verbal and written communications skills.
5. Ability and desire to work in a proactive, highly engaging, high-pressure, client service environment.
6. Good presentation skills.
Bachelor’s Degree in Information Technology or related field desirable.
• 5 years of Database administrator experience in Microsoft technologies
• Experience with Azure SQL in a multi-region configuration
• Azure certifications (Good to have)
• 2+ Years’ Experience in performing data migrations upgrades/modernizations, performance tuning on IaaS and PaaS Managed Instance and SQL Azure
• Experience with routine maintenance, recovery, and handling failover of databases
• Knowledge of RDBMSs, e.g., Microsoft SQL Server, and of the Azure cloud platform
• Expertise in Microsoft SQL Server on VMs, Azure SQL Managed Instance, and Azure SQL
• Experience in setting up and working with Azure data warehouses
Roles and Responsibilities:
1. Develop, enhance, document, and maintain application features in C#/Asp.Net.
2. Excellent understanding of Database concepts and strong ability to write well-tuned SQL Statements.
3. Participate in design, code and test inspections throughout product life cycle to contribute technical expertise and to identify issues.
4. Knowledge of developing desktop-based applications is also desirable.
5. Understand technical project priorities, implementation, dependencies, risks and issues.
6. Big Data knowledge is a Plus.
7. Drive design reviews while adhering to security requirements.
8. Provide direction for .Net developers and act as escalation point for questions or issues.
9. Performs technical analysis to identify and troubleshoot application code-related issues.
Keyskills
1. Must have hands-on experience with ASP.NET, ASP.NET MVC, ASP.NET 4.5, and .NET Core
2. Must have good experience in C#
3. Must have good experience with APIs
4. Must have hands-on experience in HTML5, AngularJS, CSS, CSS3, jQuery, Bootstrap, and JavaScript
5. Must have experience in Microsoft SQL Server
This opening is with an MNC
ROLE AND RESPONSIBILITIES
Should be able to work as an individual contributor and maintain good relationships with stakeholders. Should be proactive in learning new skills per business requirements. Familiar with extracting relevant data and cleansing and transforming data into insights that drive business value, through the use of data analytics, data visualization, and data modeling techniques.
QUALIFICATIONS AND EDUCATION REQUIREMENTS
Technical Bachelor’s Degree.
Non-Technical Degree holders should have 1+ years of relevant experience.
Roles and Responsibilities
• Ability to create solution prototype and conduct proof of concept of new tools.
• Work in research and understanding of new tools and areas.
• Clearly articulate pros and cons of various technologies/platforms and perform
detailed analysis of business problems and technical environments to derive a
solution.
• Optimisation of the application for maximum speed and scalability.
• Work on feature development and bug fixing.
Technical skills
• Must have knowledge of the networking in Linux, and basics of computer networks in
general.
• Must have intermediate/advanced knowledge of one programming language,
preferably Python.
• Must have experience of writing shell scripts and configuration files.
• Should be proficient in bash.
• Should have excellent Linux administration capabilities.
• Working experience of SCM. Git is preferred.
• Knowledge of build and CI-CD tools, like Jenkins, Bamboo etc is a plus.
• Understanding of Architecture of OpenStack/Kubernetes is a plus.
• Code contributed to OpenStack/Kubernetes community will be a plus.
• Data Center network troubleshooting will be a plus.
• Understanding of NFV and SDN domain will be a plus.
Soft skills
• Excellent verbal and written communications skills.
• Highly driven, positive attitude, team player, self-learning, self-motivating, and flexible
• Strong customer focus - Decent Networking and relationship management
• Flair for creativity and innovation
• Strategic thinking
This is an individual contributor role and will need client interaction on the technical side.
Must have Skill - Linux, Networking, Python, Cloud
Additional Skills-OpenStack, Kubernetes, Shell, Java, Development
🚀 Exciting Opportunity: Data Engineer Position in Gurugram 🌐
Hello
We are actively seeking a talented and experienced Data Engineer to join our dynamic team at Reality Motivational Venture in Gurugram (Gurgaon). If you're passionate about data, thrive in a collaborative environment, and possess the skills we're looking for, we want to hear from you!
Position: Data Engineer
Location: Gurugram (Gurgaon)
Experience: 5+ years
Key Skills:
- Python
- Spark, Pyspark
- Data Governance
- Cloud (AWS/Azure/GCP)
Main Responsibilities:
- Define and set up analytics environments for "Big Data" applications in collaboration with domain experts.
- Implement ETL processes for telemetry-based and stationary test data.
- Support in defining data governance, including data lifecycle management.
- Develop large-scale data processing engines and real-time search and analytics based on time series data.
- Ensure technical, methodological, and quality aspects.
- Support CI/CD processes.
- Foster know-how development and transfer, continuous improvement of leading technologies within Data Engineering.
- Collaborate with solution architects on the development of complex on-premise, hybrid, and cloud solution architectures.
Qualification Requirements:
- BSc, MSc, MEng, or PhD in Computer Science, Informatics/Telematics, Mathematics/Statistics, or a comparable engineering degree.
- Proficiency in Python and the PyData stack (Pandas/Numpy).
- Experience in high-level programming languages (C#/C++/Java).
- Familiarity with scalable processing environments like Dask (or Spark).
- Proficient in Linux and scripting languages (Bash Scripts).
- Experience in containerization and orchestration of containerized services (Kubernetes).
- Education in database technologies (SQL/OLAP and NoSQL).
- Interest in Big Data storage technologies (Elastic, ClickHouse).
- Familiarity with Cloud technologies (Azure, AWS, GCP).
- Fluent English communication skills (speaking and writing).
- Ability to work constructively with a global team.
- Willingness to travel for business trips during development projects.
Preferable:
- Working knowledge of vehicle architectures, communication, and components.
- Experience in additional programming languages (C#/C++/Java, R, Scala, MATLAB).
- Experience in time-series processing.
How to Apply:
Interested candidates, please share your updated CV/resume with me.
Thank you for considering this exciting opportunity.
Job Description
Technical lead who will be responsible for development, managing team(s), and monitoring tasks/sprints. They will also work with BAs to gather new requirements and change requests, and will help solve application issues and help developers when they are stuck.
Responsibilities
· Design and develop application based on the architecture provided by the solution architects.
· Help team members and co-developers achieve their tasks.
· Maintain/monitor new work items and support issues, and assign them to the respective developers.
· Communicate with BA persons and Solution architects for the new requirements and change requests.
· Resolve any support tickets with the help of your team within service timelines.
· Manage sprint to achieve the targets.
Technical Skills
· Microsoft .NET MVC
· .NET Core 3.1 or greater
· C#
· Web API
· Async Programming, Threading, and tasks
· Test Driven Development
· Strong expertise in SQL (table design, programming, optimization)
· Azure Functions
· Azure Storage
· MongoDB, NoSQL
Qualifications/Skills Desired:
· Any Bachelor’s degree relevant to Computer Science. MBA or equivalent is a plus
· Minimum of 8-10 years of IT experience, including managing team(s), out of which 4-5 years should be as a technical/team lead.
· Strong verbal and written communication skills, with the ability to adapt to many different personalities; conflict resolution skills required
· Must have excellent organizational and time management skills with strong attention to detail
· Confidentiality with privacy-sensitive customer and employee documents
· Strong work ethic - demonstrate good attitude and judgment, discretion, and maintain high level of confidentiality
· Previous experience of customer interactions
is a software product company that provides
- 5+ years of experience designing, developing, validating, and automating ETL processes
- 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS, and SSRS
- 2+ years of experience with cloud technologies and platforms, such as Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow, or similar
- Must have experience with designing and implementing data access layers
- Must be an expert with SQL/T-SQL and Python
- Must have experience with Kafka
- Define and implement data models with various database technologies like MongoDB, CosmosDB, Neo4j, MariaDB, and SQL Server
- Ingest and publish data from sources and to destinations via an API
- Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus
- Exposure to healthcare technologies and integrations for FHIR API, HL7, or other HIE protocols is a plus
Skills Required :
Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols
Job Description:
As an Azure Data Engineer, your role will involve designing, developing, and maintaining data solutions on the Azure platform. You will be responsible for building and optimizing data pipelines, ensuring data quality and reliability, and implementing data processing and transformation logic. Your expertise in Azure Databricks, Python, SQL, Azure Data Factory (ADF), PySpark, and Scala will be essential for performing the following key responsibilities:
Designing and developing data pipelines: You will design and implement scalable and efficient data pipelines using Azure Databricks, PySpark, and Scala. This includes data ingestion, data transformation, and data loading processes.
Data modeling and database design: You will design and implement data models to support efficient data storage, retrieval, and analysis. This may involve working with relational databases, data lakes, or other storage solutions on the Azure platform.
Data integration and orchestration: You will leverage Azure Data Factory (ADF) to orchestrate data integration workflows and manage data movement across various data sources and targets. This includes scheduling and monitoring data pipelines.
Data quality and governance: You will implement data quality checks, validation rules, and data governance processes to ensure data accuracy, consistency, and compliance with relevant regulations and standards.
Performance optimization: You will optimize data pipelines and queries to improve overall system performance and reduce processing time. This may involve tuning SQL queries, optimizing data transformation logic, and leveraging caching techniques.
Monitoring and troubleshooting: You will monitor data pipelines, identify performance bottlenecks, and troubleshoot issues related to data ingestion, processing, and transformation. You will work closely with cross-functional teams to resolve data-related problems.
Documentation and collaboration: You will document data pipelines, data flows, and data transformation processes. You will collaborate with data scientists, analysts, and other stakeholders to understand their data requirements and provide data engineering support.
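The responsibilities above follow a common ingest, validate, transform pattern. As an illustrative sketch only, plain Python stands in here for the PySpark/Databricks code the role actually calls for; all record fields and rules are hypothetical:

```python
# Illustrative ingest -> validate -> transform pipeline stage.
# In Databricks each step would typically be a DataFrame transformation
# scheduled and monitored from Azure Data Factory.

def ingest(rows):
    """Parse raw CSV-like records into typed dicts, skipping malformed input."""
    parsed = []
    for raw in rows:
        try:
            order_id, amount = raw.split(",")
            parsed.append({"order_id": order_id.strip(),
                           "amount": float(amount)})
        except ValueError:
            continue  # a real pipeline would route these to a quarantine sink

    return parsed

def validate(records):
    """Apply data-quality rules (here: non-empty id, non-negative amount)."""
    return [r for r in records if r["order_id"] and r["amount"] >= 0]

def transform(records):
    """Aggregate amounts per order id."""
    totals = {}
    for r in records:
        totals[r["order_id"]] = totals.get(r["order_id"], 0.0) + r["amount"]
    return totals

raw = ["A1, 10.5", "A1, 4.5", "bad-row", "A2, -3", "A2, 7"]
result = transform(validate(ingest(raw)))
print(result)  # {'A1': 15.0, 'A2': 7.0}
```

The same three stages map directly onto a pipeline: ingestion from a source, quality checks enforcing governance rules, and transformation before loading.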
Skills and Qualifications:
Strong experience with Azure Databricks, Python, SQL, ADF, PySpark, and Scala.
Proficiency in designing and developing data pipelines and ETL processes.
Solid understanding of data modeling concepts and database design principles.
Familiarity with data integration and orchestration using Azure Data Factory.
Knowledge of data quality management and data governance practices.
Experience with performance tuning and optimization of data pipelines.
Strong problem-solving and troubleshooting skills related to data engineering.
Excellent collaboration and communication skills to work effectively in cross-functional teams.
Understanding of cloud computing principles and experience with Azure services.
Roles and Responsibilities
- Highly skilled in back-end development using .NET C#
- Strong experience in the Microsoft Azure environment
- Strong experience in REST API development
- Experienced in microservice architecture
- Experienced in an agile development and project environment
- Experienced with Microsoft Teams, Git and GitHub
- Strong ability to produce technical documentation for developed solutions
- Familiarity with agile development tools like Atlassian Jira favourable
- Familiarity with Azure SQL favourable
- Understanding of CI/CD favourable
PLSQL Developer
experience of 4 to 6 years
Skills: MS SQL Server and Oracle, AWS or Azure
• Experience in setting up RDS service in cloud technologies such as AWS or Azure
• Strong proficiency with SQL and its variation among popular databases
• Should be well-versed in writing stored procedures, functions and packages, and in using collections
• Skilled at optimizing large, complicated SQL statements
• Should have worked in migration projects.
• Should have worked on creating reports.
• Should be able to distinguish between normalized and de-normalized data modelling designs and use cases.
• Knowledge of best practices when dealing with relational databases
• Capable of troubleshooting common database issues
• Familiarity with tools for profiling and optimizing server resource usage
• Proficient understanding of code versioning tools such as Git and SVN
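Optimizing large SQL statements usually starts with reading the query plan. A minimal sketch of that workflow, using the stdlib sqlite3 module as a stand-in for SQL Server or Oracle (table and column names are hypothetical):

```python
# Demonstrates how adding an index changes a query plan from a full
# table scan to an index search. SQLite stands in for the production
# database; the same EXPLAIN-driven workflow applies there.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, i * 1.5) for i in range(1000)])

def plan(sql):
    """Return SQLite's query plan as a single string."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT total FROM orders WHERE customer_id = 42"
plan_before = plan(query)   # full table scan, e.g. "SCAN orders"

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
plan_after = plan(query)    # index lookup via idx_orders_customer

print(plan_before)
print(plan_after)
```

In SQL Server or Oracle the equivalent step is reading the execution plan (SET SHOWPLAN / EXPLAIN PLAN) before and after an index or rewrite.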
CyntraLabs is focused on providing organizations with unified solutions that maximize business value through the capabilities offered by emerging technologies. We take pride in providing state-of-the-art solutions in integration and retail that guarantee success in business evolution. Businesses must transform to stay relevant, so we give our existing and new partners the foresight to become agile, realize new growth opportunities, and adapt to new technologies.
Key Responsibilities:
1. Design and develop MuleSoft and Azure architecture solutions based on client requirements
2. Work with clients to understand their business needs and provide technical guidance on MuleSoft and Azure solutions
3. Collaborate with other architects, developers, and project managers to ensure seamless integration of MuleSoft and Azure solutions with existing systems
4. Create and maintain technical documentation for MuleSoft and Azure architecture solutions
5. Perform architecture assessments and provide recommendations for MuleSoft and Azure solutions
6. Implement MuleSoft and Azure best practices and standards for architecture, design, and development
7. Develop custom MuleSoft and Azure components, connectors, and templates to support project requirements
8. Participate in code reviews and provide feedback to other developers
9. Provide technical leadership and mentorship to other MuleSoft and Azure developers
10. Stay up-to-date with MuleSoft and Azure technology trends and best practices
11. Assist with project scoping, planning, and estimation
12. Communicate with clients to understand their business requirements and provide technical guidance
13. Work on Azure-specific projects, including infrastructure architecture, security, and monitoring
14. Design and implement Azure solutions such as Azure Logic Apps, Functions, Event Grids, and API Management
15. Work with Azure services such as Azure Blob Storage, Azure SQL, and Cosmos DB
16. Integrate MuleSoft and Azure services using APIs and connectors
Eligibility:
1. A Bachelor's degree in Computer Science or related field is preferred
2. 8-10 years of experience in software development, with at least 3 years of experience in MuleSoft and Azure architecture and design
3. Strong understanding of integration patterns, SOA, and API design principles
4. Experience with Anypoint Studio and the MuleSoft Anypoint Platform
5. Hands-on experience with MuleSoft development including creating APIs, connectors, and integration flows
6. Understanding of RESTful web services, HTTP, JSON, and XML
7. Familiarity with software development best practices such as Agile and DevOps
8. Excellent problem-solving skills and ability to troubleshoot complex technical issues
9. Strong communication and interpersonal skills
10. Ability to work independently as well as in a team-oriented, collaborative environment
11. MuleSoft certification as a Solution Architect and Azure certification as an Architect is preferred
Microsoft Azure Data Integration Engineer
Job Description:
We are seeking a Microsoft Azure Data Integration Developer to design and build cutting-edge user experiences for our client’s consumer-facing desktop application. The Senior Software Developer will work closely with product owners, UX designers, and front-end and back-end developers to help build the next-generation platform.
Key Skills:
- 3+ years of experience in an enterprise or consumer software development environment using C# and designing and supporting Azure environments
- Expert level programming skills in creating MVC and Microservices architecture
- Experience with modern frameworks and design patterns like .Net core
- Strong knowledge of C# language
- Hands-on experience using the Azure administration portal and iPaaS
- Demonstrable experience deploying enterprise workloads to Azure
- Hands-on experience with Azure Function Apps, App Service, Logic Apps and Storage, Azure Key Vault integration, and Azure SQL Database
- Solid understanding of object-oriented programming.
- Experience developing user interfaces and customizing UI controls/components
- Microsoft Azure Certification, Business Continuity, or Disaster Recovery planning experience is a plus
Responsibilities:
- Architect, design & build a modern web application for consumers
- Explore configuring hybrid connectivity between on-premises environments and Azure, and how to monitor network performance to comply with service-level agreements.
- Collaborate with UI/UX teams to deliver high-performing and easy-to-use applications
- Participate in code reviews with staff as necessary to ensure a high-quality, performant product
- Develop a deep understanding of client goals and objectives, and articulate how your solutions address their needs
- Unit testing/test-driven development
- Integration testing
- Deploying Azure VMs (Windows Server) in a highly available environment
- Regularly reviewing existing systems and making recommendations for improvements.
- Maintenance
- Post-deployment production support and troubleshooting
Technical Expertise and Familiarity:
- Cloud Technologies: Azure, iPaaS
- Microsoft: .NET Core, ASP.NET, MVC, C#
- Frameworks/Technologies: MVC, Microservices, Web Services, REST API, Java Script, JQuery, CSS, Testing Frameworks
- IDEs: Visual Studio, VS Code
- Databases: MS SQL Server, Azure SQL
- Familiarity with Agile software development methodologies.
- Advanced knowledge of using Git source control system.
- Azure, AWS, and GCP certifications preferred.
EXPERIENCE: 5 – 12 years
LEVEL: Senior & Lead Software Engineers
JOB LOCATION: EPAM India Locations
Must Have Skills :
1. .NET Full Stack Developer (.NET, C#, JavaScript, Angular 4 & above, PostgreSQL)
2. Experience in Unit testing
3. Hands on experience with building RESTful Web APIs, micro services
4. Experience in Asynchronous programming
5. Good understanding of Authentication and Authorization models in Web APIs
6. Good grasp of data structures and algorithms
7. Experience with Entity Framework
8. Added advantage: Experience in Azure
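The authentication and authorization point above can be sketched with a simplified token scheme. This is illustrative only (a JWT-like HMAC signature); a production Web API would use a vetted library and add expiry and claims validation, and the secret would come from a key vault, not source code:

```python
# Simplified sketch of token-based Web API authentication: the server
# issues a signed token and later verifies the signature before
# trusting the payload (authorization decisions use the "role" claim).
import base64
import hashlib
import hmac
import json

SECRET = b"demo-secret"  # hypothetical key, for the sketch only

def issue_token(payload):
    """Serialize the payload and append an HMAC-SHA256 signature."""
    body = base64.urlsafe_b64encode(json.dumps(payload, sort_keys=True).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token):
    """Return the payload if the signature checks out, else None."""
    body, _, sig = token.partition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # reject: a Web API would answer 401 here
    return json.loads(base64.urlsafe_b64decode(body))

tok = issue_token({"sub": "user-1", "role": "reader"})
print(verify_token(tok))        # the original payload comes back
print(verify_token(tok + "0"))  # None: tampered signature is rejected
```

The same issue/verify split underlies the bearer-token models these APIs typically use.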
Project Overview: We are looking for expert level Postgres database developer to work on a software application development project for a fortune 500 US based telecom client. The application is web based and used across multiple teams to support their business processes. The developer will be responsible for developing various components of the Postgres database and for light administration of the database.
Key Responsibilities: Collaborate with onshore, offshore and other team members to understand the user stories and develop code. Develop and execute scripts to unit test. Collaborate with onshore developers, product owner and the client team to perform work in an integrated manner.
Professional Attributes: Should have the ability to work independently and seek guidance as and when necessary - Should have good communication skills - Flexible working in different time zones if necessary - Good team player - Mentoring juniors
Experience preferred:
- Extensive experience in Postgres database development (expert level)
- Experience in Postgres administration.
- Must have working experience with GIS data functionality
- Experience handling large datasets (tables with 50-100M rows)
- Preferred – exposure to Azure or AWS
- Must have skillsets for database performance tuning
- Familiarity with web applications
- Ability to work independently with minimal oversight
- Experience working cohesively in integrated teams
- Good interpersonal, communication, documentation and presentation skills.
- Prior experience working in agile environments
- Ability to communicate effectively both orally and in writing with clients, Business Analysts and Developers
- Strong analytical, problem-solving and conceptual skills
- Excellent organizational skills; attention to detail
- Ability to resolve project issues effectively and efficiently
- Ability to prioritize workload and consistently meet deadlines
- Experience working with onshore-offshore model
Criteria:
- BE/MTech/MCA/MSc
- 3+ years of hands-on experience in T-SQL / PL/SQL / PG SQL or NoSQL
- Immediate joiners preferred*
- Candidates will be selected based on logical/technical and scenario-based testing
Note: Candidates who have attended the interview process with TnS in the last 6 months will not be eligible.
Job Description:
- Technical Skills Desired:
- Experience in MS SQL Server and one of these relational DBs (PostgreSQL / AWS Aurora / MySQL / Oracle) or NoSQL DBs (MongoDB / DynamoDB / DocumentDB) in an application development environment, with eagerness to switch
- Design database tables, views, indexes
- Write functions and procedures for Middle Tier Development Team
- Work with any front-end developers in completing the database modules end to end (hands-on experience in parsing of JSON & XML in Stored Procedures would be an added advantage).
- Query Optimization for performance improvement
- Design & develop SSIS Packages or any other Transformation tools for ETL
- Functional Skills Desired:
- Banking / Insurance / Retail domain experience would be a plus
- Experience interacting with clients would be a plus
- Good to Have Skills:
- Knowledge in a Cloud Platform (AWS / Azure)
- Knowledge on version control system (SVN / Git)
- Exposure to Quality and Process Management
- Knowledge in Agile Methodology
- Soft skills: (additional)
- Team building (attitude to train, work along, mentor juniors)
- Communication skills (all kinds)
- Quality consciousness
- Analytical acumen to all business requirements
Think out of the box for business solutions
Back-end developers design and enable code-based innovation in our suite of web-based platforms and databases. Together with the front-end development team, they deliver an intuitive, highly functional and seamless experience that delights our users.
This opportunity will rely on your ability to develop backend code; create, edit, or manipulate platform databases; and leverage an API-based integration approach for platform development.
We are looking for:
- Prior experience designing and developing backend code in support of web applications and platforms
- Sound programming skills and logic
- Excellent team spirit: including daily engagement, strong communication skills and ability to easily collaborate with various stakeholders
- Excellent time-management skills and accountability for tasks on a daily, weekly and sprint basis
Skills Required:
- Sound knowledge of and experience with backend technologies such as Node, SQL and Python
- Understanding of secure APIs and their successful use in a commercial setting
- Understanding of JSON files, data manipulation techniques and products such as Databricks
- Experience with Agile development efforts and Azure DevOps tools
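The JSON data-manipulation skill above often means reshaping nested records into tabular rows before loading them into a platform like Databricks. A small sketch with hypothetical field names:

```python
# Flatten nested JSON into dot-separated tabular columns, a typical
# preprocessing step before loading records into an analytics platform.
import json

def flatten(obj, prefix=""):
    """Collapse nested dicts into a single level of dot-separated keys."""
    rows = {}
    for key, value in obj.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            rows.update(flatten(value, name + "."))
        else:
            rows[name] = value
    return rows

record = json.loads('{"id": 7, "user": {"name": "Ada", "geo": {"country": "IN"}}}')
print(flatten(record))
# {'id': 7, 'user.name': 'Ada', 'user.geo.country': 'IN'}
```

Each flattened dict then maps cleanly onto a row of a table or DataFrame.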
Key initial responsibilities:
- Design, develop, test and compile high quality code, data artifacts , processes, in support of the buildout of our platform and web applications
- Collaborating with Product Owners, Business Analysts, Designers and Front-end developers to establish and satisfy development objectives
- Foster innovation supporting new and existing programs, products, and features through a combination of thought leadership and emerging industry trends
- Reconciling data and reporting deliverables to internal and client teams
- Participating in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Taking lead on projects as needed.
at Softobiz Technologies Private limited
Responsibilities
- Design and implement Azure BI infrastructure, ensure overall quality of delivered solution
- Develop analytical & reporting tools, promote and drive adoption of developed BI solutions
- Actively participate in BI community
- Establish and enforce technical standards and documentation
- Participate in daily scrums
- Record progress daily in assigned DevOps items
Ideal Candidates should have
- 5 + years of experience in a similar senior business intelligence development position
- To be successful in the role you will require a high level of expertise across all facets of the Microsoft BI stack and prior experience in designing and developing well-performing data warehouse solutions
- Demonstrated experience using development tools such as Azure SQL database, Azure Data Factory, Azure Data Lake, Azure Synapse, and Azure DevOps.
- Experience with development methodologies including Agile, DevOps, and CI/CD patterns
- Strong oral and written communication skills in English
- Ability and willingness to learn quickly and continuously
- Bachelor's Degree in computer science
API Lead Developer
Job Overview:
As an API developer for a very large client, you will fill the role of a hands-on Azure API Developer. We are looking for someone with the technical expertise to build and maintain sustainable API solutions that support identified needs and expectations from the client.
Delivery Responsibilities
- Implement an API architecture using Azure API Management, including security, API Gateway, Analytics, and API Services
- Design reusable assets, components, standards, frameworks, and processes to support and facilitate API and integration projects
- Conduct functional, regression, and load testing on APIs
- Gather requirements and define the strategy for application integration
- Develop using the following types of Integration protocols/principles: SOAP and Web services stack, REST APIs, RESTful, RPC/RFC
- Analyze, design, and coordinate the development of major components of the APIs including hands on implementation, testing, review, build automation, and documentation
- Work with DevOps team to package release components to deploy into higher environment
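The gateway side of the responsibilities above comes down to matching an incoming method and path to a backend operation. As a language-agnostic sketch of that dispatch step (the routes and handlers are hypothetical; in Azure API Management this mapping is configured, not hand-coded):

```python
# Minimal sketch of method+path routing, the dispatch step an API
# gateway performs before forwarding a request to a backend service.

ROUTES = {}

def route(method, path):
    """Decorator registering a handler for a (method, path) pair."""
    def register(handler):
        ROUTES[(method, path)] = handler
        return handler
    return register

@route("GET", "/orders")
def list_orders():
    # A real backend would query a data store; a canned list suffices here.
    return 200, ["order-1", "order-2"]

def dispatch(method, path):
    """Look up the registered operation, or answer 404 if none matches."""
    handler = ROUTES.get((method, path))
    if handler is None:
        return 404, "Not Found"
    return handler()

print(dispatch("GET", "/orders"))   # (200, ['order-1', 'order-2'])
print(dispatch("POST", "/orders"))  # (404, 'Not Found')
```

Policies such as authentication, rate limiting, and analytics hooks sit conceptually between `dispatch` and the handler.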
Required Qualifications
- Expert Hands-on experience in the following:
- Technologies such as Spring Boot, Microservices, API Management & Gateway, Event Streaming, Cloud-Native Patterns, Observability & Performance optimizations
- Data modelling, Master and Operational Data Stores, Data ingestion & distribution patterns, ETL / ELT technologies, Relational and Non-Relational DB's, DB Optimization patterns
- At least 5 years of experience with Azure APIM
- At least 8 years’ experience in Azure SaaS and PaaS
- At least 8 years’ experience in API Management, including technologies such as MuleSoft and Apigee
- At least the last 5 years in consulting, with the latest implementation on Azure SaaS services
- At least 5 years in MS SQL / MySQL development, including data modeling, concurrency, and stored procedure development and tuning
- Excellent communication skills with a demonstrated ability to engage, influence, and encourage partners and stakeholders to drive collaboration and alignment
- High degree of organization, individual initiative, results and solution oriented, and personal accountability and resiliency
- Should be a self-starter and team player, capable of working with a team of architects, co-developers, and business analysts
Preferred Qualifications:
- Ability to work as a collaborative team, mentoring and training the junior team members
- Working knowledge of building data integration, engineering, and orchestration solutions
- Position requires expert knowledge across multiple platforms, integration patterns, processes, data/domain models, and architectures.
- Candidates must demonstrate an understanding of the following disciplines: enterprise architecture, business architecture, information architecture, application architecture, and integration architecture.
- Ability to focus on business solutions and understand how to achieve them according to the given timeframes and resources.
- Recognized as an expert/thought leader. Anticipates and solves highly complex problems with a broad impact on a business area.
- Experience with Agile Methodology / Scaled Agile Framework (SAFe).
- Outstanding oral and written communication skills including formal presentations for all levels of management combined with strong collaboration/influencing.
Preferred Education/Skills:
- Prefer Master’s degree
- Bachelor’s Degree in Computer Science with a minimum of 12+ years relevant experience or equivalent.
Qualifications:
BTech/BE in computer science, electrical, electronics or related fields. 5+ years of full-stack design and development experience. High emotional intelligence, empathy and a collaborative approach. Experience with Angular JavaScript frameworks, CSS, HTML5, NodeJS, ExpressJS and MongoDB for full-stack web development. Experience developing rich dynamic front-end applications using Angular and CSS frameworks like BulmaCSS, Angular Material, Bootstrap, etc. Knowledge of GraphQL would be an added advantage. Knowledge of cloud services like AWS, Heroku and Azure is preferable. Should be a quick learner who keeps up with the pace of the ever-changing world of technology, as the candidate will get excellent exposure to the latest and trending cloud-based SaaS technologies and best practices while working with varied customers based across the globe.
Responsibilities:
Develop web applications covering the end-to-end software development life cycle, from writing UI code in Angular to backend API code in NodeJS and managing databases like MongoDB and MySQL. Handle full-stack code management from Git check-ins to running automated builds and deployments, using DevOps practices to deploy to public cloud services like AWS, Azure and Heroku. Own the full-stack web development workflow from front end to backend to CI/CD. Design and develop the tech architecture, working closely with the CEO and CTO of the company. Drive and guide the work of other engineers on the team.
This is a leadership role, and the candidate is expected to wear multiple technical hats, including customer interactions and investor discussions.
• Strong experience working with Big Data technologies like Spark (Scala/Java), Apache Solr, HIVE, HBase, ElasticSearch, MongoDB, Airflow, Oozie, etc.
• Experience working with Relational databases like MySQL, SQLServer, Oracle etc.
• Good understanding of large system architecture and design
• Experience working in AWS/Azure cloud environment is a plus
• Experience using Version Control tools such as Bitbucket/GIT code repository
• Experience using tools like Maven/Jenkins, JIRA
• Experience working in an Agile software delivery environment, with exposure to continuous integration and continuous delivery tools
• Passionate about technology and delivering solutions to solve complex business problems
• Great collaboration and interpersonal skills
• Ability to work with team members and lead by example in code, feature development, and knowledge sharing
- Microsoft development experience using C#, ASP.NET Core Web API, MVC, Authentication and Authorization; proficient in developing large-scale web applications using the .NET framework
- Hands-on working experience with setting up applications using Azure Functions, Azure SQL and NoSQL
- Guide the development team with building applications on Azure Cloud
- Learn and research new solutions for application development that can be applied to the business problems we solve
Requirements:
- Problem solver using data and deep understanding of designing/setting up solutions
- Minimum of 2 years’ experience using and implementing Azure Active Directory and Azure Functions with various triggers
- Strong interpersonal and communication skills and flexibility to work US Hours if needed
- Be flexible to learn new technologies and apply them to solve business problems
- Strong Communication skills (verbal and written).
- Extremely detail-oriented and well-organized.
- Ability to positively engage with the clients and build strong long-term relationships.
- Ability to work efficiently and effectively in a high-paced environment and under deadlines.
About Us:
DataBeat.io is a data and analytics services company that provides big data, analytics and operations management services to companies globally.
Working at DataBeat.io puts you at the forefront of the big data and analytics ecosystem. You will work with clients who are leading companies that develop breakthrough solutions, concepts that are shaping the technology world, and cutting-edge tools. It is a fast-growing company where your performance and contribution could move you into leadership positions fairly quickly.
at Altimetrik
DevOps Architect
Experience: 10-12+ years of relevant DevOps experience
Locations : Bangalore, Chennai, Pune, Hyderabad, Jaipur.
Qualification:
• Bachelor’s or advanced degree in Computer Science, Software Engineering or equivalent is required.
• Certifications in specific areas are desired
Technical Skillset: Skills Proficiency level
- Build tools (Ant or Maven) - Expert
- CI/CD tool (Jenkins or Github CI/CD) - Expert
- Cloud DevOps (AWS CodeBuild, CodeDeploy, Code Pipeline etc) or Azure DevOps. - Expert
- Infrastructure As Code (Terraform, Helm charts etc.) - Expert
- Containerization (Docker, Docker Registry) - Expert
- Scripting (linux) - Expert
- Cluster deployment (Kubernetes) & maintenance - Expert
- Programming (Java) - Intermediate
- Application Types for DevOps (Streaming like Spark, Kafka, Big data like Hadoop etc) - Expert
- Artifactory (JFrog) - Expert
- Monitoring & Reporting (Prometheus, Grafana, PagerDuty etc.) - Expert
- Ansible, MySQL, PostgreSQL - Intermediate
• Source Control (like Git, Bitbucket, Svn, VSTS etc)
• Continuous Integration (like Jenkins, Bamboo, VSTS )
• Infrastructure Automation (like Puppet, Chef, Ansible)
• Deployment Automation & Orchestration (like Jenkins, VSTS, Octopus Deploy)
• Container Concepts (Docker)
• Orchestration (Kubernetes, Mesos, Swarm)
• Cloud (like AWS, Azure, GoogleCloud, Openstack)
Roles and Responsibilities
• The DevOps architect should automate processes with appropriate tooling.
• Developing appropriate DevOps channels throughout the organization.
• Evaluating, implementing and streamlining DevOps practices.
• Establishing a continuous build environment to accelerate software deployment and development processes.
• Engineering general and effective processes.
• Helping operations and development teams solve their problems.
• Supervising, examining and handling technical operations.
• Providing DevOps processes and operations.
• Capacity to lead teams with a strong leadership attitude.
• Must possess excellent automation skills and the ability to drive initiatives to automate processes.
• Building strong cross-functional leadership skills and working together with the operations and engineering teams to make sure that systems are scalable and secure.
• Excellent knowledge of software development and software testing methodologies along with configuration management practices in Unix and Linux-based environment.
• Possess sound knowledge of cloud-based environments.
• Experience in handling automated deployment CI/CD tools.
• Must possess excellent knowledge of infrastructure automation tools (Ansible, Chef, and Puppet).
• Hands-on experience in working with Amazon Web Services (AWS).
• Must have strong expertise in operating Linux/Unix environments and scripting languages like Python, Perl, and Shell.
• Ability to review deployment and delivery pipelines i.e., implement initiatives to minimize chances of failure, identify bottlenecks and troubleshoot issues.
• Previous experience in implementing continuous delivery and DevOps solutions.
• Experience in designing and building solutions to move data and process it.
• Must possess expertise in any of the coding languages depending on the nature of the job.
• Experience with containers and container orchestration tools (AKS, EKS, OpenShift, Kubernetes, etc)
• Experience with version control systems a must (GIT an advantage)
• Belief in "Infrastructure as Code" (IaC), including experience with open-source tools such as Terraform
• Treats best practices for security as a requirement, not an afterthought
• Extensive experience with version control systems like GitLab and their use in release management, branching, merging, and integration strategies
• Experience working with Agile software development methodologies
• Proven ability to work on cross-functional Agile teams
• Mentor other engineers in best practices to improve their skills
• Creating suitable DevOps channels across the organization.
• Designing efficient practices.
• Delivering comprehensive best practices.
• Managing and reviewing technical operations.
• Ability to work independently and as part of a team.
• Exceptional communication skills, be knowledgeable about the latest industry trends, and highly innovative
Mandatory skills:
.NET Core 3.x, microservices, and AWS or Azure.
Exp : 5-12 years
Budget: Max 30 LPA.
Looking for candidates available at short notice.
Job Descriptions: Senior .NET Cloud (Azure) Practitioner
Job Description Experience: 5-12 years (approx.)
Notice period: 15 days only
Location: Hyderabad, Bangalore, Chennai
Mandatory Skills
- Strong Restful API, Micro-services development experience using ASP.NET CORE Web APIs (C#);
- Must have exceptionally good software design and programming skills in the .NET Core (.NET 3.x, .NET 6) platform, C#, ASP.NET MVC, ASP.NET Web API (RESTful), Entity Framework & LINQ
- Good working knowledge on Azure Functions, Docker, and containers
- Expertise in Microsoft Azure Platform - Azure Functions, Application Gateway, API Management, Redis Cache, App Services, Azure Kubernetes, CosmosDB, Azure Search, Azure Service Bus, Function Apps, Azure Storage Accounts, Azure KeyVault, Azure Log Analytics, Azure Active Directory, Application Insights, Azure SQL Database, Azure IoT, Azure Event Hubs, Azure Data Factory, Virtual Networks and networking.
- Strong SQL Server expertise and familiarity with Azure Cosmos DB, Azure Storage (Blob, Table, Queue), Azure SQL, etc.
- Experienced in Test-Driven Development, unit testing libraries, testing frameworks.
- Good knowledge of Object Oriented programming, including Design Patterns
- Cloud Architecture - Technical knowledge and implementation experience using common cloud architecture, enabling components, and deployment platforms.
- Excellent written and oral communication skills, along with the proven ability to work as a team with other disciplines outside of engineering are a must
- Solid analytical, problem-solving and troubleshooting skills
Desirable Skills:
- Certified Azure Solution Architect Expert
- Microsoft Certified: Azure Fundamentals (Exam AZ-900)
- Microsoft Certified: Azure Administrator Associate (Exam AZ-104)
- Microsoft Certified: Azure Developer Associate (Exam AZ-204)
- Microsoft Certified: DevOps Engineer Expert (AZ-400)
- Microsoft Certified: Azure Solutions Architect Expert (AZ-305)
- Good understanding of software architecture, scalability, resilience, performance;
- Working knowledge of automation tools such as Azure DevOps, Azure Pipelines, Jenkins, or similar
Roles & Responsibilities
- Defining best practices & standards for usage of libraries, frameworks and other tools being used;
- Architect, design, and implement software through development, delivery, and release.
- Break down complex requirements into independent architectural components, modules, tasks, and strategies, and collaborate with peer leadership through the full software development lifecycle to deliver top-quality software on time and within budget.
- Demonstrate excellent communications with stakeholders regarding delivery goals, objectives, deliverables, plans and status throughout the software development lifecycle.
- Should be able to work with various stakeholders (Architects, Product Owners, Leadership) as well as the team, whether as Lead, Principal, or Individual Contributor for Web UI / Front-End Development;
- Should be able to work in an agile, dynamic team environment;
Consulting & implementation services in the Oil & Gas, Mining, and Manufacturing industries
Job Responsibilities:
- Technically sound in .NET technology, with good working knowledge of and experience in Web API and SQL Server.
- Should be able to carry out requirement analysis, design, coding, and unit testing, and support fixing defects reported during QA, UAT, and go-live phases.
- Able to work alone or as part of a team with minimal or no supervision from Delivery Leads.
- Good experience required in the Azure integration stack: Logic Apps, Azure Functions, APIM, and Application Insights.
Must-have skills
- Strong Web API development using ASP.NET Core, Logic Apps, Azure Functions, APIM
- Azure Functions
- Azure Logic Apps
- Azure APIM
- Azure Service Bus
Desirable Skills
- Azure Event Grid/Hub
- Azure Key Vault
- Azure SQL – knowledge of SQL queries
• Architect, develop and maintain highly scalable, reliable and secure distributed backend systems on cloud (AWS or Azure) or on-premises environments for Mihup, its customers and partners.
• Work closely with your fellow engineers to develop systems capable of concurrently processing massive amounts of voice data, both in online real-time and offline environments.
• Drive accountability for test-driven development and delivery of high-quality features and resilient, enterprise-class solutions.
• Lead a culture of team ownership and direct individual and team accountability to continuously improve how they work to achieve results.
• Work in a startup environment, pushing boundaries with deep involvement with the business.
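Voice-pipeline specifics aside, the concurrent-processing requirement above boils down to fanning work out to a pool of workers. A minimal sketch with Python's standard library (the transcription step and all names are hypothetical, not Mihup code):

```python
from concurrent.futures import ThreadPoolExecutor

def transcribe_chunk(chunk: bytes) -> str:
    """Placeholder for a real speech-to-text call on one audio chunk."""
    return f"transcript of {len(chunk)} bytes"

def process_stream(chunks, max_workers=4):
    # Fan the chunks out to a worker pool; map() preserves input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(transcribe_chunk, chunks))

results = process_stream([b"aa", b"bbbb", b"cccccc"])
```

In a real system the pool would be replaced by distributed consumers behind a message broker, but the fan-out/collect shape is the same.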
Requirements (what we are looking for)
The right person is better than the right set of experiences; these are the traits we've identified that make great additions to our team.
• BE/BTech/ME/MTech in Computer Science or a related field from a Tier I or Tier II university.
• 6–8 years of hands-on software development and deployment experience (experience of working in a consumer product startup during its growth phase will be a plus), of which 1–2 years would be in leading a team of software developers.
• Expertise in Java 8+ and Spring Boot is a must.
• Good experience in messaging platforms like RabbitMQ/Kafka.
• Good experience in distributed systems and relational & NoSQL databases like PostgreSQL, MySQL, Redis, MongoDB, etc.
• Expertise in one or more of Java, NodeJS, GoLang, or Python would be preferred.
• Must have hands-on experience with products that handle multiple concurrent API calls, run CPU-intensive jobs, and make multiple asynchronous system calls.
• Worked on microservices-based architecture and design.
• Managed/owned infrastructure on AWS/Microsoft Azure/private clouds and set up high-availability systems.
• Knowledge of Docker and Kubernetes.
• Knowledge of RESTful APIs, caching concepts, the HTTP protocol and general web architecture.
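The caching concept named in the last requirement can be shown in miniature with memoization; the profile-lookup function below is invented for illustration, and in a real service the same idea appears as an HTTP cache, a Redis layer, or an in-process LRU:

```python
from functools import lru_cache

call_count = 0  # counts how often the "expensive backend" is actually hit

@lru_cache(maxsize=256)
def get_user_profile(user_id: int) -> dict:
    """Simulate an expensive backend lookup, cached per user_id."""
    global call_count
    call_count += 1
    return {"id": user_id, "name": f"user-{user_id}"}

get_user_profile(1)
get_user_profile(1)  # second call is served from the cache
```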
Pluses
1. Anything that will let us know more about who you are that you would like to share, such as blogs, Twitter, Medium, GitHub, etc.
XpressBees – a logistics company started in 2015 – is amongst the fastest growing companies in its sector. While we started off rather humbly in the space of ecommerce B2C logistics, the last 5 years have seen us steadily progress towards expanding our presence. Our vision to evolve into a strong full-service logistics organization reflects itself in our new lines of business like 3PL, B2B Xpress and cross-border operations. Our strong domain expertise and constant focus on meaningful innovation have helped us rapidly evolve as the most trusted logistics partner of India. We have progressively carved our way towards best-in-class technology platforms, an extensive network reach, and a seamless last-mile management system. While on this aggressive growth path, we seek to become the one-stop shop for end-to-end logistics solutions. Our big focus areas for the very near future include strengthening our presence as service providers of choice and leveraging the power of technology to improve efficiencies for our clients.
Job Profile
As a Lead Data Engineer in the Data Platform Team at XpressBees, you will build the data platform and infrastructure to support high-quality and agile decision-making in our supply chain and logistics workflows. You will define the way we collect and operationalize data (structured/unstructured), and build production pipelines for our machine learning models and our (RT, NRT, batch) reporting & dashboarding requirements. You will use your experience with modern cloud and data frameworks to build products (with storage and serving systems) that drive optimisation and resilience in the supply chain via data visibility, intelligent decision making, insights, anomaly detection and prediction.
What You Will Do
• Design and develop the data platform and data pipelines for reporting, dashboarding and machine learning models. These pipelines would productionize machine learning models and integrate with agent review tools.
• Meet data completeness, correctness and freshness requirements.
• Evaluate and identify the data store and data streaming technology choices.
• Lead the design of the logical model and implement the physical model to support business needs. Come up with logical and physical database designs across platforms (MPP, MR, Hive/Pig) that are optimal for different use cases (structured/semi-structured). Envision and implement the optimal data modelling, physical design and performance optimization technique/approach required for the problem.
• Support your colleagues by reviewing code and designs.
• Diagnose and solve issues in our existing data pipelines, and envision and build their successors.
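As a toy illustration of the extract-transform-load step such pipelines productionize, including a crude completeness check of the kind the requirements above demand (the table, columns and data are invented, not XpressBees schemas):

```python
import sqlite3

# Extract: raw rows land in a staging table, possibly incomplete.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_shipments (id INTEGER, weight_kg REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_shipments VALUES (?, ?, ?)",
    [(1, 2.5, "delivered"), (2, None, "in_transit"), (3, 1.0, "delivered")],
)

# Transform + load: drop rows failing the completeness check, materialize clean table.
conn.execute(
    """CREATE TABLE shipments_clean AS
       SELECT id, weight_kg, status FROM raw_shipments
       WHERE weight_kg IS NOT NULL"""
)
clean_count = conn.execute("SELECT COUNT(*) FROM shipments_clean").fetchone()[0]
```

A production version would run on Spark or a warehouse engine rather than sqlite3, but the extract/validate/materialize shape carries over.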
Qualifications & Experience relevant for the role
• A bachelor's degree in Computer Science or a related field, with 6 to 9 years of technology experience.
• Knowledge of relational and NoSQL data stores, stream processing and micro-batching to make technology & design choices.
• Strong experience in system integration, application development, ETL and data-platform projects. Talented across technologies used in the enterprise space.
• Software development experience using:
• Expertise in relational and dimensional modelling
• Exposure across all of the SDLC process
• Experience in cloud architecture (AWS)
• Proven track record in keeping existing technical skills current and developing new ones, so that you can make strong contributions to deep architecture discussions around systems and applications in the cloud (AWS).
• Characteristics of a forward thinker and self-starter who flourishes with new challenges and adapts quickly to learning new knowledge.
• Ability to work with cross-functional teams of consulting professionals across multiple projects.
• Knack for helping an organization to understand application architectures and integration approaches, to architect advanced cloud-based solutions, and to help launch the build-out of those systems.
• Passion for educating, training, designing, and building end-to-end systems.
- 5+ years of experience in a Data Engineering role in a cloud environment
- Must have good experience in Scala/PySpark (preferably in a Databricks environment)
- Extensive experience with Transact-SQL
- Experience in Databricks/Spark
- Strong experience in data warehouse projects
- Expertise in database development projects with ETL processes
- Manage and maintain data engineering pipelines
- Develop batch processing, streaming and integration solutions
- Experienced in building and operationalizing large-scale enterprise data solutions and applications
- Using one or more Azure data and analytics services in combination with custom solutions
- Azure Data Lake, Azure SQL DW (Synapse), and SQL Database products, or equivalent products from other cloud services providers
- In-depth understanding of data management (e.g. permissions, security, and monitoring)
- Cloud repositories, e.g. Azure DevOps, GitHub, Git
- Experience in an agile environment (prefer Azure DevOps)
Good to have
- Manage source data access security
- Automate Azure Data Factory pipelines
- Continuous Integration/Continuous Deployment (CI/CD) pipelines and source repositories
- Experience in implementing and maintaining CI/CD pipelines
- Power BI understanding; Delta Lakehouse architecture
- Knowledge of software development best practices
- Excellent analytical and organization skills
- Effective working in a team as well as working independently
- Strong written and verbal communication skills
- Expertise in database development projects and ETL processes
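The "batch processing, streaming and integration" requirement above rests on micro-batching, the execution model behind engines like Spark Structured Streaming. A plain-Python sketch of the idea (a toy, not Databricks code):

```python
from itertools import islice

def micro_batches(records, batch_size):
    """Group a (possibly unbounded) record iterator into fixed-size
    micro-batches, the way micro-batch streaming engines chunk input."""
    it = iter(records)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

batches = list(micro_batches(range(7), batch_size=3))
```

Each yielded batch would then be handed to the same transformation logic a pure batch job uses, which is what makes the batch/streaming code paths converge.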
Responsibilities
- Design, plan and control the implementation of business solutions requests/demands.
- Execute best practices in design and coding, and guide the rest of the team in accordance with them.
- Gather requirements and specifications to understand client needs in detail and translate them into system requirements.
- Drive complex technical projects from planning through execution
- Perform code review and manage technical debt
- Handling release deployments and production issues
- Coordinate stress tests, stability evaluations, and support for the concurrent processing of specific solutions
- Participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, perform code reviews and unit test plan reviews
Skills
- Degree in Informatics Engineering, Computer Science, or in similar areas
- Minimum of 5+ years' work experience in similar roles
- Expert knowledge in developing cloud-based applications with Java, Spring Boot, Spring REST, Spring Data JPA, and Spring Cloud
- Strong understanding of Azure Data Services
- Strong working knowledge of SQL Server, SQL Azure Database, NoSQL, data modeling, Azure AD, ADFS, and identity & access management
- Hands-on experience with the ThingWorx platform (application development, mashup creation, installation of ThingWorx and ThingWorx components)
- Strong knowledge of IoT platforms
- Development experience in microservices architecture best practices, Docker, and Kubernetes
- Experience designing /maintaining/tuning high-performance code to ensure optimal performance
- Strong knowledge of web security practices
- Experience working in Agile development
- Knowledge of Google Cloud Platform and Kubernetes
- Good understanding of Git, source control procedures, and feature branching
- Fluent in English - written and spoken (mandatory)
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role who has attained a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
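The workflow managers listed above (Azkaban, Luigi, Airflow) all reduce to one core idea: run tasks in an order that respects their declared dependencies. A toy sketch of that idea using the standard library's graphlib (task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Each key is a task; its set lists the tasks it depends on,
# mirroring how a DAG is declared in Airflow or Luigi.
dag = {
    "load": {"transform"},
    "transform": {"extract_a", "extract_b"},
    "extract_a": set(),
    "extract_b": set(),
}

# static_order() yields every task only after all its dependencies.
order = list(TopologicalSorter(dag).static_order())
```

Real schedulers add retries, backfills and parallelism on top, but the topological ordering is the backbone.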
* Formulates and recommends standards for achieving maximum performance and efficiency of the DW ecosystem.
* Participates in pre-sales activities for solutions to various customer problem statements/situations.
* Develops business cases and ROI for the customer/clients.
* Interviews stakeholders and develops a BI roadmap for success given project prioritization.
* Evangelizes self-service BI and visual discovery while helping to automate any manual process at the client site.
* Works closely with the Engineering Manager to ensure prioritization of customer deliverables.
* Champions data quality, integrity, and reliability throughout the organization by designing and promoting best practices.
* Implementation (20%)
* Helps DW/DE team members with issues needing technical expertise or complex systems and/or programming knowledge.
* Provides on-the-job training for new or less experienced team members.
* Develops a technical excellence team.
Requirements
- Experience designing business intelligence solutions
- Experience with ETL processes and data warehouse architecture
- Experience with Azure data services, i.e. ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, and Power BI
- Good analytical and problem-solving skills
- Fluent in relational database concepts and flat-file processing concepts
- Must be knowledgeable in software development lifecycles/methodologies
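The last two requirements, flat-file processing and relational database concepts, meet in the classic load-a-CSV-into-a-staging-table step. A miniature sketch (file contents and staging schema are invented for illustration):

```python
import csv
import io
import sqlite3

# A flat-file extract; in practice this would be read from disk or blob storage.
flat_file = io.StringIO("id,city\n1,Pune\n2,Delhi\n")
rows = list(csv.DictReader(flat_file))

# Load the parsed rows into a relational staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_city (id INTEGER, city TEXT)")
conn.executemany(
    "INSERT INTO stg_city VALUES (?, ?)",
    [(int(r["id"]), r["city"]) for r in rows],
)
loaded = conn.execute("SELECT COUNT(*) FROM stg_city").fetchone()[0]
```

In an Azure stack the same pattern shows up as ADF copy activities landing files into Synapse or Azure SQL DB staging tables.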
Envoy combines technology and global immigration services to offer the only immigration management platform that makes it seamless for companies to hire and manage an international workforce. We empower companies to acquire the best talent regardless of where they are in the world; help mobilize employees around the world to take advantage of business opportunities; and enable the management of entire global workforces, providing a strategic, proactive view into workforce and financial forecasting and compliance. We are a fast-growing, award-winning technology company, a leader in our space, and are backed by some of the country's leading venture capital and growth equity firms.
Our Engineering Team
We have a passionate product engineering team that works on complex technical challenges, employs creativity, and constantly learns a variety of frameworks, tools, and technologies. While we are dedicated to delivering an excellent product, we also believe having fun along the way motivates us to pour our hearts into what we do. Our team has mastered the development and delivery process, allowing it to focus on designing, architecting, and crafting masterpieces.
We are growing rapidly and expanding our team in India; join our product engineering team to be part of this exciting journey.
"Programming isn't about what you know; it's about what you can figure out.” - Chris Pine
Skills Required
- 8+ years of strong programming experience on .NET platform
- Expertise in C#, ASP.NET Web API / TypeScript / Angular (or any front-end framework, or a passion to learn Angular)
- Hands on experience with SQL
- Azure experience is a plus but not a deal breaker
- Knowledge and experience with HTML, CSS, JavaScript (fundamental building blocks of web development)
- Experience designing high-level and low-level design of the system
- Experience with microservices architecture is a plus
Expectations
- Provide technical guidance to the team and take ownership of delivery
- Responsible for technically grooming backlog items, providing design, architecture, and implementation details
- Quality is the key driver to successful delivery, ensure highly testable and quality deliverables
- Leverage troubleshooting and analytical skills to analyze issues
- Experience with debugging, performance profiling and optimization
Envoy Global is an equal opportunity employer and will recruit, hire, train and promote into all job levels the most qualified applicants without regard to race, color, religion, sex, national origin, age, disability, ancestry, sexual orientation, gender identification, veteran status, pregnancy, or any other protected classification.
Envoy combines technology and global immigration services to offer the only immigration management platform that makes it seamless for companies to hire and manage an international workforce. We empower companies to acquire the best talent regardless of where they are in the world; help mobilize employees around the world to take advantage of business opportunities; and enable the management of entire global workforces, providing a strategic, proactive view into workforce and financial forecasting and compliance. We are a fast-growing, award-winning technology company, a leader in our space, and are backed by some of the country's leading venture capital and growth equity firms.
Our Engineering Team
We have a passionate product engineering team that works on complex technical challenges, employs creativity, and constantly learns a variety of frameworks, tools, and technologies. While we are dedicated to delivering an excellent product, we also believe having fun along the way motivates us to pour our hearts into what we do. Our team has mastered the development and delivery process, allowing it to focus on designing, architecting, and crafting masterpieces.
We are growing rapidly and expanding our team in India; join our product engineering team to be part of this exciting journey.
“Any fool can write code that a computer can understand. Good programmers write code that humans can understand.” – Martin Fowler
Skills Required
- 5+ years of strong programming experience on .NET platform
- Expertise in C#, ASP.NET Web API / TypeScript / Angular (or any front-end framework, or a passion to learn Angular)
- Hands on experience with SQL
- Azure experience is a plus but not a deal breaker
- Knowledge and experience with HTML, CSS, and JavaScript
- Experience designing high-level and low-level design of the system
- Experience with microservices architecture is a plus
Expectations
- Contribute to technical grooming of backlog items, providing design, architecture, and implementation details
- Quality is the key driver to successful delivery, ensure highly testable and quality deliverables
- Leverage troubleshooting and analytical skills to analyze issues
- Experience with debugging, performance profiling and optimization
Envoy Global is an equal opportunity employer and will recruit, hire, train and promote into all job levels the most qualified applicants without regard to race, color, religion, sex, national origin, age, disability, ancestry, sexual orientation, gender identification, veteran status, pregnancy, or any other protected classification.
Skillsets: Azure, OLAP, ETL, SQL, Python, C#
Experience range: 3 to 4+ years
Salary: best in industry
Notice period: currently serving notice period (immediate joiners are preferred)
Location: remote work
Job type: permanent role, full-time and fully remote
Note: the interview has 3 rounds – technical round, manager/client round, HR round
- Azure Data Factory, Azure Databricks, Talend, BODS, Jenkins
- Microsoft Office (mandatory)
- Strong knowledge of databases, Azure Synapse, data management, SQL
- Knowledge of cloud platforms (Azure, AWS, etc.)
Mandatory:
● A minimum of 1 year of development, system design or engineering experience
● Excellent social, communication, and technical skills
● In-depth knowledge of Linux systems
● Development experience in at least two of the following languages: PHP, Go, Python, JavaScript, C/C++, Bash
● In-depth knowledge of web servers (Apache, Nginx preferred)
● Strong in using DevOps tools – Ansible, Jenkins, Docker, ELK
● Knowledge of APM tools; New Relic is preferred
● Ability to learn quickly, master our existing systems and identify areas of improvement
● Self-starter who enjoys and takes pride in the engineering work of their team
● Tried-and-tested real-world cloud computing experience – AWS/GCP/Azure
● Strong understanding of resilient systems design
● Experience in network design and management
o Strong Python development skills, with 7+ years' experience with SQL.
o A bachelor's or master's degree in Computer Science or related areas.
o 8+ years of experience in data integration and pipeline development.
o Experience in implementing Databricks Delta Lake and data lakes.
o Expertise designing and implementing data pipelines using modern data engineering approaches and tools: SQL, Python, Delta Lake, Databricks, Snowflake, Spark.
o Experience in working with multiple file formats (Parquet, Avro, Delta Lake) & APIs.
o Experience with AWS Cloud on data integration with S3.
o Hands-on development experience with Python and/or Scala.
o Experience with SQL and NoSQL databases.
o Experience in using data modeling techniques and tools (focused on dimensional design).
o Experience with micro-service architecture using Docker and Kubernetes.
o Experience working with one or more of the public cloud providers, i.e. AWS, Azure or GCP.
o Experience in effectively presenting and summarizing complex data to diverse audiences through visualizations and other means.
o Excellent verbal and written communication skills and strong leadership capabilities.
Skills:
Python
Experience with Agile development and software such as Azure DevOps or JIRA. Product Owner certification is a plus.
Experience with global teams.
Bachelor's required; CS degree preferred.
Location: Delhi NCR
Opening: Immediate
Search Context
Over 1.8 billion non-salaried informal sector workers globally, and roughly 700m Indians, are not eligible for pension or other social protection benefits. Without an urgent and effective response to pension exclusion, they face the grim prospect of extreme poverty for over 20 years once they are too old to work.
pinBox is the only global pensionTech committed exclusively to mass-scale digital micro-pension inclusion among self-employed women and youth. We deploy our white-labelled, API-enabled pension administration and delivery platform, our unique deployment model and a simple and intuitive UI/UX to make access to regulated pension, savings and insurance products easy and simple for non-salaried informal sector workers. We're working actively with governments, regulators, multilateral aid agencies and leading financial inclusion stakeholders in Asia and Africa. The pinBox model is already operating in Rwanda, Kenya and India. We will expand to Bangladesh, Uganda, Chile, Indonesia and Nigeria by 2023.
Governments and pension regulators use our pensionTech to jumpstart digital micro-pension and insurance inclusion among informal sector workers. Pension funds and insurers use our pensionTech to build a mass market for their products beyond their traditional agent-led customer base. Banks, MNOs, cooperatives, MFIs, fintech firms and gig platforms use our plug-and-play pensionTech to instantly offer an integrated social protection solution to their clients, members and employees without any new investments in IT or capacity enhancement.
We've recently completed our first equity fundraise to enhance our engineering, business and delivery capacity and embark on the next stage of pinBox pensionTech development and expansion. By 2025, we aim to enable and assist 100 million excluded individuals to start saving for their old age in a secure, affordable and well-regulated environment.
pinBox is looking for senior software engineers who are deeply passionate about using IT to solve difficult, real-life problems at scale across multiple countries.
The Senior Software Engineer will be expected to:
1. Design, code, test, deploy and maintain applications to satisfy business requirements;
2. Plan and implement technical efforts through all phases of the software development process;
3. Collaborate cross-functionally to make continuous improvements to the pinBox pensionTech platform;
4. Help drive engineering plans through a broad approach to engineering quality (consistent and thoughtful patterns, improved observability, unit and integration testing, etc.);
5. Adhere to national and global architecture standards, risk management and security policies;
6. Monitor the performance of applications and work with developers to continuously improve and optimize performance.
The ideal candidate possesses:
1. An undergraduate degree in engineering;
2. At least 6 years' experience as a software engineer or in a similar role;
3. Experience working with distributed version control systems such as Git/Mercurial;
4. Frontend: experience with HTML, CSS, Bootstrap, JavaScript and jQuery is necessary; experience with React/Angular will be an advantage;
5. Backend: experience with Django/Python and PostgreSQL or any other RDBMS is mandatory; experience with Redis will be an advantage;
6. Experience in working with AWS/Azure/Google Cloud;
7. As our applications use a number of third-party micro-services, experience with REST APIs, as also with the Indian digital finance ecosystem (UPI, e-KYC), will be both necessary and an advantage;
8. Critical thinking and problem-solving skills; and
9. Excellent teamwork and interpersonal skills, a keen eye for detail and the ability to function effectively and proactively under tight deadlines.
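As a rough sketch of the REST semantics this role works with daily (a dict-backed toy standing in for a real Django view backed by PostgreSQL; the resource path and payload are hypothetical):

```python
# In-memory "database" for the sketch; a real service would use an ORM.
store = {}

def handle(method, path, body=None):
    """Dispatch a request the way a tiny REST view would:
    PUT creates/replaces a resource, GET retrieves it or 404s."""
    resource_id = path.rstrip("/").split("/")[-1]
    if method == "PUT":
        store[resource_id] = body
        return 200, body
    if method == "GET":
        if resource_id in store:
            return 200, store[resource_id]
        return 404, None
    return 405, None  # method not allowed

status, _ = handle("PUT", "/subscribers/42", {"name": "Asha"})
status_get, payload = handle("GET", "/subscribers/42")
```

The same resource-per-URL, status-code-per-outcome contract is what makes third-party micro-services (UPI, e-KYC gateways) composable behind a platform like this.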