50+ SQL Jobs in Bangalore (Bengaluru) | SQL Job openings in Bangalore (Bengaluru)
Apply to 50+ SQL Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest SQL Job opportunities across top companies like Google, Amazon & Adobe.
Real-Time Marketing Automation Built on Customer Data Platform (CDP) for Enterprises
Qualifications:
- 1-3 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
- Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from Premier Institutes
Skill Set:
- Experience in Java, Spring Boot.
- Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures,& Optimizations in addition to Coding)
- Good knowledge of Databases - SQL, NoSQL, MySQL, PostgreSQL
- Experience in Kafka
- Experience in MongoDB, AWS
- Knowledge of Unit Testing a plus
Soft Skills:
- Has an appreciation of technology and its ability to create value in the marketing domain
- Excellent written and verbal communication skills
- Active & contributing team member
- Strong work ethic with demonstrated ability to meet and exceed commitments
- Others: Experience of having worked in a start-up is a plus
Key Responsibilities:
- Design and Develop large scale sub-systems
- To periodically explore latest technologies (esp Open Source) and prototype sub-systems
- Be a part of the team that develops the next-gen Targeting platform
- Build components to make the customer data platform more efficient and scalable
Context for the Candidates:
Blockchain engineers are tasked with building secure, scalable, and high-performance decentralized applications. This discussion focuses on how to utilize key technologies—Node.js, Rust, Go, TypeScript, Ethers.js, viem, and Hardhat—in smart contract and blockchain system development, while addressing the practical challenges that arise in such projects.
Key Discussion Points:
• How do you integrate Node.js, Rust, Go, and TypeScript in blockchain-backed applications?
• Discuss the trade-offs between using Rust or Go for performance-critical blockchain services.
• What are the key challenges in ensuring the security and scalability of smart contracts using tools like Ethers.js, viem, and Hardhat?
• How do you approach testing, debugging, and deploying smart contracts in a decentralized environment?
• Share best practices for gas optimization, contract upgradability, and backend scalability.
Evaluation Criteria:
1. Technical Knowledge:
• Strong knowledge of Node.js, Rust, Go, and TypeScript with practical experience in blockchain development.
• Proficiency in Ethers.js, viem, Hardhat, and smart contract lifecycle (development, testing, deployment).
• Understanding of security best practices, performance optimizations, and scalability in decentralized systems.
2. Problem-Solving and Integration Skills:
• How candidates approach integrating smart contract solutions with backend services using Node.js, Rust, Go, and TypeScript.
• Solutions proposed for common challenges in blockchain projects such as gas fees, security vulnerabilities, and system bottlenecks.
3. Experience and Expertise:
• Minimum of 2 years of IT experience, including hands-on development with the specified technologies.
• Practical knowledge of blockchain architecture, consensus mechanisms, and decentralized application deployment.
4. Innovation and Critical Thinking:
• Candidates’ ability to think creatively about system architecture, proposing scalable and secure solutions for blockchain-based applications.
• Discussion on the future potential of blockchain technology and how the tech stack can evolve.
5. Communication and Team Collaboration:
• Clear articulation of technical challenges and how to address them in a team setting.
• Ability to lead or contribute to discussions in a way that encourages collaboration and team-driven problem-solving.
Building the machine learning production (or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-of-the-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable Model tracking, model experimentation, Model automation
- Develop ML pipelines to support
- Develop MLOps components in Machine learning development life cycle using Model Repository (either of): MLFlow, Kubeflow Model Registry
- Develop MLOps components in Machine learning development life cycle using Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku or any relevant ML E2E PaaS/SaaS
- Work across all phases of Model development life cycle to build MLOPS components
- Build the knowledge base required to deliver increasingly complex MLOPS projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
Required Qualifications
- 3-5 years experience building production-quality software.
- B.E/B.Tech/M.Tech in Computer Science or related technical degree OR Equivalent
- Strong experience in System Integration, Application Development or Data Warehouse projects across technologies used in the enterprise space
- Knowledge of MLOps, machine learning and docker
- Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
- CI/CD experience( i.e. Jenkins, Git hub action,
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
Building the machine learning production System(or MLOps) is the biggest challenge most large companies currently have in making the transition to becoming an AI-driven organization. This position is an opportunity for an experienced, server-side developer to build expertise in this exciting new frontier. You will be part of a team deploying state-ofthe-art AI solutions for Fractal clients.
Responsibilities
As MLOps Engineer, you will work collaboratively with Data Scientists and Data engineers to deploy and operate advanced analytics machine learning models. You’ll help automate and streamline Model development and Model operations. You’ll build and maintain tools for deployment, monitoring, and operations. You’ll also troubleshoot and resolve issues in development, testing, and production environments.
- Enable Model tracking, model experimentation, Model automation
- Develop scalable ML pipelines
- Develop MLOps components in Machine learning development life cycle using Model Repository (either of): MLFlow, Kubeflow Model Registry
- Machine Learning Services (either of): Kubeflow, DataRobot, HopsWorks, Dataiku or any relevant ML E2E PaaS/SaaS
- Work across all phases of Model development life cycle to build MLOPS components
- Build the knowledge base required to deliver increasingly complex MLOPS projects on Azure
- Be an integral part of client business development and delivery engagements across multiple domains
Required Qualifications
- 5.5-9 years experience building production-quality software
- B.E/B.Tech/M.Tech in Computer Science or related technical degree OR equivalent
- Strong experience in System Integration, Application Development or Datawarehouse projects across technologies used in the enterprise space
- Expertise in MLOps, machine learning and docker
- Object-oriented languages (e.g. Python, PySpark, Java, C#, C++)
- Experience developing CI/CD components for production ready ML pipeline.
- Database programming using any flavors of SQL
- Knowledge of Git for Source code management
- Ability to collaborate effectively with highly technical resources in a fast-paced environment
- Ability to solve complex challenges/problems and rapidly deliver innovative solutions
- Team handling, problem solving, project management and communication skills & creative thinking
- Foundational Knowledge of Cloud Computing on Azure
- Hunger and passion for learning new skills
Client based at Bangalore location.
Key Responsibilities:
· Develop and maintain VBA applications and Microsoft Access databases to
· automate workflows and improve business processes.
· Design efficient data models, queries, forms, reports, and user-friendly interfaces
· that align with project requirements.
· Troubleshoot and resolve application issues, ensuring high performance, data
· integrity, and security.
· Collaborate with teams to understand functional requirements and implement
· solutions effectively.
· Provide detailed documentation, including code comments, technical specifications,
· and user manuals.
· Conduct testing and debugging to ensure applications are reliable and meet end-client
· expectations.
· Support end-users by offering training, resolving queries, and implementing
· enhancements as needed.
Skills and Qualifications:
· 3-5 years of proven experience in VBA development with a strong focus on
· Microsoft Access.
· Proficiency in Visual Basic for Applications (VBA), Microsoft Access, and SQL.
· Solid experience in database design, including tables, queries, and form/report
· creation within Access.
· Strong problem-solving and debugging skills, with attention to detail and ability to
· work independently.
· Excellent communication skills to effectively collaborate with teams and document
Preferred Skills:
· Knowledge of Microsoft Excel Macros and Power BI is a plus.
· Prior experience with projects for USA-based clients.
· Familiarity with tools like JIRA, Trello, or similar for project tracking.
VB.NET Developer (5+ Years Experience)
Job Description
We are seeking a highly skilled VB.NET developer with 5+ years of experience to join our development team. The ideal candidate will have a strong foundation in VB.NET and the .NET Framework, along with a proven track record of designing, developing, and deploying scalable and maintainable software solutions.
Responsibilities:
- Design, develop, and implement complex applications using VB.NET and the .NET Framework.
- Collaborate with designers, product managers, and other stakeholders to understand requirements and translate them into functional and user-friendly applications.
- Write clean, efficient, and well-documented code that adheres to best practices and coding standards.
- Develop and maintain web applications using ASP.NET (Web Forms or MVC).
- Build and interact with databases using ADO.NET.
- Create and implement reusable components and libraries.
- Troubleshoot and debug complex technical issues.
- Participate in code reviews and provide constructive feedback.
- Stay up-to-date with the latest trends and technologies in VB.NET and the .NET Framework.
- Contribute to the continuous improvement of our development processes and tools.
Required Skills and Experience:
- 5+ years of experience as a VB.NET developer.
- In-depth knowledge of VB.NET syntax, data types, control flow, and object-oriented programming (OOP) concepts.
- Strong understanding of the .NET Framework, including the CLR, CLI, and BCL.
- Experience with ASP.NET (Web Forms or MVC) for building web applications.
- Proficiency in ADO.NET for database access.
- Experience with Windows Forms for desktop application development (a plus).
- Familiarity with web services (SOAP and REST) and their integration.
- Understanding of XML and its use in data exchange and configuration.
- Knowledge of SQL for interacting with relational databases.
Preferred Skills:
- Experience with design patterns for writing maintainable code.
- Proficiency in version control systems like Git or SVN.
- Experience with unit testing frameworks like NUnit or MSTest.
- Understanding of cloud platforms (Azure, AWS) is a plus.
- Knowledge of API development and integration.
- Experience with security best practices and vulnerability mitigation techniques.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
Benefits:
We offer a competitive salary and benefits package, along with the opportunity to work on challenging and rewarding projects.
If you are a talented and passionate VB.NET developer with a proven track record of success, we encourage you to apply!
Key Responsibilities
- Design, develop, and optimize data pipelines using Apache Spark to process large volumes of structured and unstructured data.
- Write efficient and maintainable code in Scala and Python for data extraction, transformation, and loading (ETL) operations.
- Collaborate with cross-functional teams to define data engineering solutions to support analytics and machine learning initiatives.
- Implement and maintain data lake and warehouse solutions using cloud platforms (e.g., AWS, GCP, Azure).
- Ensure data workflows and distributed systems' performance, scalability, and reliability.
- Perform data quality assessments, implement monitoring, and improve data governance practices.
- Assist in migrating and refactoring legacy data systems into modern distributed data processing platforms.
- Provide technical leadership and mentorship to junior engineers and contribute to best practices in coding, testing, and deployment.
Qualifications
- Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
- 6+ years of hands-on experience in data engineering, with strong skills in Apache Spark, Scala, and Python.
- Experience with distributed data processing frameworks and real-time data processing.
- Strong experience with big data technologies such as Hadoop, Hive, and Kafka.
- Proficient with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (Cassandra, HBase, MongoDB).
- Knowledge of CI/CD pipelines and DevOps practices for deploying data workflows.
- Strong problem-solving skills and experience with optimizing large-scale data systems.
- Excellent communication and collaboration skills.
- Experience with orchestration tools like Airflow
- Experience with containerization and orchestration (e.g., Docker, Kubernetes)
Key Responsibilities
- Design, develop, and optimize data pipelines using Apache Spark to process large volumes of structured and unstructured data.
- Write efficient and maintainable code in Scala and Python for data extraction, transformation, and loading (ETL) operations.
- Collaborate with cross-functional teams to define data engineering solutions to support analytics and machine learning initiatives.
- Implement and maintain data lake and warehouse solutions using cloud platforms (e.g., AWS, GCP, Azure).
- Ensure data workflows and distributed systems' performance, scalability, and reliability.
- Perform data quality assessments, implement monitoring, and improve data governance practices.
- Assist in migrating and refactoring legacy data systems into modern distributed data processing platforms.
- Provide technical leadership and mentorship to junior engineers and contribute to best practices in coding, testing, and deployment.
Qualifications
- Bachelor's or Master’s degree in Computer Science, Engineering, or a related field.
- 8+ years of hands-on experience in data engineering, with strong skills in Apache Spark, Scala, and Python.
- Experience with distributed data processing frameworks and real-time data processing.
- Strong experience with big data technologies such as Hadoop, Hive, and Kafka.
- Proficient with relational databases (SQL, PostgreSQL, MySQL) and NoSQL databases (Cassandra, HBase, MongoDB).
- Knowledge of CI/CD pipelines and DevOps practices for deploying data workflows.
- Strong problem-solving skills and experience with optimizing large-scale data systems.
- Excellent communication and collaboration skills.
- Experience with orchestration tools like Airflow
- Experience with containerization and orchestration (e.g., Docker, Kubernetes)
Job Description for Data Engineer Role:-
Must have:
Experience working with Programming languages. Solid foundational and conceptual knowledge is expected.
Experience working with Databases and SQL optimizations
Experience as a team lead or a tech lead and is able to independently drive tech decisions and execution and motivate the team in ambiguous problem spaces.
Problem Solving, Judgement and Strategic decisioning skills to be able to drive the team forward.
Role and Responsibilities:
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, mentoring other members of the engineering community, and from time to time, be asked to code or evaluate code
- Collaborate with digital product managers, and leaders from other team to refine the strategic needs of the project
- Utilize programming languages like Java, Python, SQL, Node, Go, and Scala, Open Source RDBMS and NoSQL databases
- Defining best practices for data validation and automating as much as possible; aligning with the enterprise standards
Qualifications -
- Experience with SQL and NoSQL databases.
- Experience with cloud platforms, preferably AWS.
- Strong experience with data warehousing and data lake technologies (Snowflake)
- Expertise in data modelling
- Experience with ETL/LT tools and methodologies
- Experience working on real-time Data Streaming and Data Streaming platforms
- 2+ years of experience in at least one of the following: Java, Scala, Python, Go, or Node.js
- 2+ years working with SQL and NoSQL databases, data modeling and data management
- 2+ years of experience with AWS, GCP, Azure, or another cloud service.
As a Senior Analyst you will play a crucial role in improving customer experience, retention, and growth by identifying opportunities in the moment that matter and surfacing them to teams across functions. You have experience with collecting, stitching and analyzing Voice of Customer and CX data, and are passionate about customer feedback. You will partner with the other Analysts on the team, as well as the Directors, in delivering impactful presentations that drive action across the organization.
You come with an understanding of customer feedback and experience analytics and the ability to tell a story with data. You are a go-getter who can take a request and run with it independently but also does not shy away from collaborating with the other team members. You are flexible and thrive in a fast-paced environment with changing priorities.
Responsibilities
- Stitch and analyze data from primary/secondary sources to determine key drivers of customer success, loyalty, risk, churn and overall experience.
- Verbalize and translate these insights into actionable tasks.
- Develop monthly and ad-hoc reporting.
- Maintain Qualtrics dashboards and surveys.
- Take on ad-hoc tasks related to XM platform onboarding, maintenance or launch of new tools
- Understand data sets within the Data Services Enterprise Data platform and other systems
- Translate user requirements independently
- Work with Business Insights, IT and technical teams
- Create PowerPoint decks and present insights to stakeholders
Qualifications
- Bachelor’s degree in data science/Analytics, Statistics or Business. Master’s degree is a plus.
- Extract data and customer insights, analyze audio or speech, and present them in any visualization tool.
- 5+ years of experience in an analytical, customer insights or related business function.
- Basic knowledge of SQL and Google Big Query.
- 1+ year of experience working with Qualtrics, Medallia or another Experience Management platform
- Hands-on experience with statistical techniques: profiling, regression analysis, trend analysis, segmentation
- Well-organized and high energy
Non-technical requirements:
- You have experience working on client-facing roles.
- You are available to work from our Bangalore office from Day 1 in a night shift. (US Shift)
- You have strong communication skills.
- You have strong analytical skills.
Job Description for QA Engineer:
- 6-10 years of experience in ETL Testing, Snowflake, DWH Concepts.
- Strong SQL knowledge & debugging skills are a must.
- Experience on Azure and Snowflake Testing is plus
- Experience with Qlik Replicate and Compose tools (Change Data Capture) tools is considered a plus
- Strong Data warehousing Concepts, ETL tools like Talend Cloud Data Integration, Pentaho/Kettle tool
- Experience in JIRA, Xray defect management toolis good to have.
- Exposure to the financial domain knowledge is considered a plus
- Testing the data-readiness (data quality) address code or data issues
- Demonstrated ability to rationalize problems and use judgment and innovation to define clear and concise solutions
- Demonstrate strong collaborative experience across regions (APAC, EMEA and NA) to effectively and efficiently identify root cause of code/data issues and come up with a permanent solution
- Prior experience with State Street and Charles River Development (CRD) considered a plus
- Experience in tools such as PowerPoint, Excel, SQL
- Exposure to Third party data providers such as Bloomberg, Reuters, MSCI and other Rating agencies is a plus
Key Attributes include:
- Team player with professional and positive approach
- Creative, innovative and able to think outside of the box
- Strong attention to detail during root cause analysis and defect issue resolution
- Self-motivated & self-sufficient
- Effective communicator both written and verbal
- Brings a high level of energy with enthusiasm to generate excitement and motivate the team
- Able to work under pressure with tight deadlines and/or multiple projects
- Experience in negotiation and conflict resolution
Overview
We're looking for a Mid-to senior level data analyst who can create underwriting methodologies based on users' financial data, assist with regular evaluation and reporting, and help us in continuous product improvement. The ideal candidate should have previous experience in a good product-based startup who can be a go-getter, and has helped the product to launch different functionalities, and has experience stabilizing it.
Roles and Responsibilities
- Analytical Skills, Data Analytics, and Statistics
- Strong communication skills
- Proficiency in SQL, Python, and Excel is mandatory.
- Attention to detail and ability to work with large datasets
- Degree in Mathematics, Statistics, Computer Science, or related field
- Relevant work experience in the finance industry is a plus
- Strong proficiency in MS Excel and knowledge of data visualization tools such as Redash.
- ML/Statistical model or algorithm is a bonus
- A good understanding of the product life cycle and product-based analytical experience is a must
- The ability to communicate with cross-functional teams and help the stakeholders with the right insights is a must
- Ability to thrive in a fast-paced environment of start-up is a must
Client based at Bangalore and Gurugram location.
Senior QA Engineer
Experience - 3 to 5 Years
Mandatory Skills- Manual Testing, Sql, ETL Processes, Data Pipelines, Data Warehousing, Geo-Spatial Data
Additional Skills - Communication Skills, Performance Testing, Automation Testing
Locations-Bengaluru/ Gurugram
Job Description
We are looking for a skilled QA / Data Engineer with 3-5 years of experience. The ideal candidate will have strong expertise in manual testing and SQL, with additional knowledge in automation and performance testing being highly desirable. You will be responsible for ensuring the quality and reliability of our data-driven applications by performing thorough testing and validation.
Must Have Skills:
Extensive experience in manual testing, particularly in data-centric environments.
Strong SQL skills for data validation, querying, and testing of database functionalities.
Experience in data engineering concepts, including ETL processes, data pipelines, and data warehousing.
Experience in Geo-Spatial data.
Solid understanding of QA methodologies and best practices for both software and data testing.
Good communication skills.
Good To Have Skills:
Experience with automation testing tools and frameworks (e.g., Selenium, JUnit) for data pipelines.
Knowledge of performance testing tools (e.g., JMeter, LoadRunner) for evaluating data systems.
Familiarity with data engineering tools and platforms (e.g., Apache Kafka, Apache Spark, Hadoop).
Understanding of cloud-based data solutions (e.g., AWS, Azure, Google Cloud) and their testing methodologies.
Required Qualification
Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)
Join an innovative and groundbreaking cybersecurity startup focused on helping customers identify, mitigate, and protect against ever-evolving cyber threats. With the current geopolitical climate, organizations need to stay ahead of malicious threat actors as well as nation-state actors. Cybersecurity teams are getting overwhelmed, and they need intelligent systems to help them focus on addressing the biggest and current risks first.
We help organizations protect their assets and customer data by continuously evaluating the new threats and risks to their cloud environment. This will, in turn, help mitigate the high-priority threats quickly so that the engineers can spend more time innovating and providing value to their customers.
About the Engineering Team:
We have several decades of experience working in the security industry, having worked on some of the most cutting-edge security technology that helped protect millions of customers. We have built technologies from the ground up, partnered with the industry on innovation, and helped customers with some of the most stringent requirements. We leverage industry and academic experts and veterans for their unique insight. Security technology includes all facets of software engineering work from data analytics and visualization, AI/ML processing, highly distributed and available services with real-time monitoring, integration with various other services, including protocol-level work. You will be learning from some of the best engineering talent with multi-cloud expertise.
We are looking for a highly experienced Principal Software Engineer to lead the development and scaling of our backend systems. The ideal candidate will have extensive experience in distributed systems, database management, Kubernetes, and cloud technologies. As a key technical leader, you will design, implement, and optimize critical backend services, working closely with cross-functional teams to ensure system reliability, scalability, and performance.
Key Responsibilities:
- Architect and Develop Distributed Systems: Design and implement scalable, distributed systems using microservices architecture. Expertise in both synchronous (REST/gRPC) and asynchronous communication patterns (message queues, Kafka), with a strong emphasis on building resilient services that can handle large data and maintain high throughput. Craft cloud solutions tailored to specific needs, choosing appropriate AWS services and optimizing resource utilization to ensure performance and high availability.
- Database Architecture & Optimization: Lead efforts to design and manage databases with a focus on scaling, replication, query optimization, and managing large datasets.
- Performance & Reliability: Engage in continuous learning and innovation to improve customer satisfaction. Embrace accountability and respond promptly to service issues to maintain and enhance system health. Ensure the backend systems meet high standards for performance, reliability, and scalability, identifying and solving bottlenecks and architectural challenges by leveraging various observability tools (such as Prometheus and Grafana).
- Leadership & Mentorship: Provide technical leadership and mentorship to other engineers, guiding architecture decisions, reviewing code, and helping to build a strong engineering culture. Stay abreast of the latest industry trends in cloud technology, adopting best practices to continuously improve our services and security measures.
Key Qualifications:
- Experience: 10+ years of experience in backend engineering, with at least 5 years of experience in building distributed systems.
- Technical Expertise:
- Distributed Systems: Extensive experience with microservices architecture, working with both synchronous (REST, gRPC) and asynchronous patterns (SNS, SNQ). Strong understanding of service-to-service authentication and authorization, API rate limiting, and other critical aspects of scalable systems.
- Database: Expertise in database technologies with experience working with large datasets, optimizing queries, handling replication, and creating views for performance. Hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, Cassandra). Expertise in various database technologies and deep experience with creating data models that provide consistent data views to the customer while data is being morphed, handling data migrations, and ensuring data integrity and high availability.
- Kubernetes: In-depth knowledge of Kubernetes, with experience deploying and managing services in Kubernetes clusters (EKS, AKS). Strong understanding of pods, services, networking, and scaling applications within Kubernetes environments.
- Golang: Proven experience using Golang as the primary programming language for backend development. Deep understanding of concurrency, performance optimization, and scalability in Golang applications.
- Cloud Technologies: Strong hands-on experience with AWS services (EC2, S3, DynamoDB, Lambda, RDS, EKS). Experience in designing and optimizing cloud-based architectures for large-scale distributed systems.
- Problem Solver: Strong problem-solving and debugging skills, with a proven ability to design and optimize complex systems.
- Leadership: Experience in leading engineering teams, guiding architectural decisions, and mentoring junior engineers.
Preferred Skills:
- Experience with infrastructure as code (Terraform, CloudFormation).
- Knowledge of GitHub-based CI/CD tools and best practices.
- Experience with monitoring and logging tools (Prometheus, Grafana, ELK).
- Cybersecurity experience.
is a leading Data & Analytics intelligence technology solutions provider to companies that value insights from information as a competitive advantage. We are the partner of choice for enterprises on their digital transformation journey. Our teams offer solutions and services at the intersection of Advanced Data, Analytics, and AI.
Skills: ITSM methodologies, Python, Snowflake and AWS. Open for 18*5 support as well.
Immediate Joiner - 30 days NP
• Bachelor’s degree in computer science, Software Engineering, or a related field.
• Should have hands on 5+ Experience in ITSM methodologies
• 3+ Years of experience in SQL, Snowflake, Python development
• 2+ years hands-on experience in Snowflake DW
• Good communication and client/stakeholders’ management skill
• Willing to work across multiple time-zone and handled team based out of off - shore
Job Description: AI/ML Engineer
Location: Bangalore (On-site)
Experience: 2+ years of relevant experience
About the Role:
We are seeking a skilled and passionate AI/ML Engineer to join our team in Bangalore. The ideal candidate will have over two years of experience in developing, deploying, and maintaining AI and machine learning models. As an AI/ML Engineer, you will work closely with our data science team to build innovative solutions and deploy them in a production environmen
Key Responsibilities:
- Develop, implement, and optimize machine learning models.
- Perform data manipulation, exploration, and analysis to derive actionable insights.
- Use advanced computer vision techniques, including YOLO and other state-of-the-art methods, for image processing and analysis.
- Collaborate with software developers and data scientists to integrate AI/ML solutions into the company's applications and products.
- Design, test, and deploy scalable machine learning solutions using TensorFlow, OpenCV, and other related technologies.
- Ensure the efficient storage and retrieval of data using SQL and data manipulation libraries such as pandas and NumPy.
- Contribute to the development of backend services using Flask or Django for deploying AI models.
- Manage code using Git and containerize applications using Docker when necessary.
- Stay updated with the latest advancements in AI/ML and integrate them into existing projects.
Required Skills:
- Proficiency in Python and its associated libraries (NumPy, pandas).
- Hands-on experience with TensorFlow for building and training machine learning models.
- Strong knowledge of linear algebra and data augmentation techniques.
- Experience with computer vision libraries like OpenCV and frameworks like YOLO.
- Proficiency in SQL for database management and data extraction.
- Experience with Flask for backend development.
- Familiarity with version control using Git.
Optional Skills:
- Experience with PyTorch, Scikit-learn, and Docker.
- Familiarity with Django for web development.
- Knowledge of GPU programming using CuPy and CUDA.
- Understanding of parallel processing techniques.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Demonstrated experience in AI/ML, with a portfolio of past projects.
- Strong analytical and problem-solving skills.
- Excellent communication and teamwork skills.
Why Join Us?
- Opportunity to work on cutting-edge AI/ML projects.
- Collaborative and dynamic work environment.
- Competitive salary and benefits.
- Professional growth and development opportunities.
If you're excited about using AI/ML to solve real-world problems and have a strong technical background, we'd love to hear from you!
Apply now to join our growing team and make a significant impact!
Real-time marketing automation built on an intelligent & secure Customer Data Platform increases conversions, retention & growth for enterprises.
Responsibilities:
- Design and Develop large scale sub-systems
- To periodically explore latest technologies (esp Open Source) and prototype sub-systems
- Be a part of the team that develops the next-gen Targeting platform
- Build components to make the customer data platform more efficient and scalable
Qualifications:
- 0-2 years of relevant experience with Java, Algorithms, Data Structures, & Optimizations in addition to Coding.
- Education: B.E/B-Tech/M-Tech/M.S in Computer Science or IT from premier institutes
Skill Set:
- Good Aptitude/Analytical skills (emphasis will be on Algorithms, Data Structures,& Optimizations in addition to Coding)
- Good knowledge of Databases - SQL, NoSQL
- Knowledge of Unit Testing a plus
Soft Skills:
- Has an appreciation of technology and its ability to create value in the marketing domain
- Excellent written and verbal communication skills
- Active & contributing team member
- Strong work ethic with demonstrated ability to meet and exceed commitments
- Others: Experience of having worked in a start-up is a plus
at REConnect Energy
Work at the Intersection of Energy, Weather & Climate Sciences and Artificial Intelligence
About the company:
REConnect Energy is India's largest tech-enabled service provider in predictive analytics and demand-supply aggregation for the energy sector. We focus on digital intelligence for climate resilience, offering solutions for efficient asset and grid management, minimizing climate-induced risks, and providing real-time visibility of assets and resources.
Responsibilities:
- Design, develop, and maintain data engineering pipelines using Python.
- Implement and optimize database solutions with SQL and NOSQL Databases (MySQL and MongoDB).
- Perform data analysis, profiling, and quality assurance to ensure high service quality standards.
- Troubleshoot and resolve data-pipeline related issues, ensuring optimal performance and reliability.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical specifications.
- Participate in code reviews and contribute to the continuous improvement of the codebase.
- Utilize GitHub for version control and collaboration.
- Implement and manage containerization solutions using Docker.
- Implement tech solutions to new product development, ensuring scalability, performance, and security.
Requirements:
- Bachelors or Master’s degree in Computer Science, Software Engineering, Electrical Engineering or equivalent.
- Proficient in Python programming skills and expertise with data engineering.
- Experience in databases including MySQL and NoSQL.
- Experience in developing and maintaining critical and high availability systems will be given strong preference.
- Experience working with AWS cloud platform.
- Strong analytical and data-driven approach to problem solving.
Node.js Developer / NestJS Developer – Job Description
A Bachelor’s Degree or Master’s Degree in Computer Science is preferred with excellent problem solving skills.
Job Type: Full-time
Job Location: Bengaluru (on site)
Preferred Skills: TypeScript / Nodejs, SQL/ MySQL
Experience: Min 2yrs in similar Role.
Responsibilities:
- Develop and Maintain Server-side Logic: Design, implement, and maintain the server-side logic using Node.js, ensuring high performance and responsiveness to requests from the front-end.
- API Development: Build and maintain RESTful APIs for seamless integration with front-end services and third-party applications.
- Database Management: Work with databases (such as MongoDB, MySQL, PostgreSQL) to ensure data consistency, reliability, and optimal performance.
- Code Quality and Testing: Write clean, maintainable, and efficient code. Implement automated testing platforms and unit tests.
- Collaboration: Work closely with front-end developers, designers, and other team members to define and implement technical solutions that meet business requirements.
- Troubleshooting and Debugging: Identify issues, debug, and resolve bugs and other technical problems in a timely manner.
- Documentation: Create and maintain documentation related to application development, API usage, and system operations.
- Stay Updated: Keep up-to-date with the latest industry trends and technologies to ensure the application remains modern and competitive.
About us
Fisdom is one of the largest wealthtech platforms that allows investors to manage their wealth in an intuitive and seamless manner. Fisdom has a suite of products and services that takes care of every wealth requirement that an individual would have. This includes Mutual Funds, Stock Broking, Private Wealth, Tax Filing, and Pension funds
Fisdom has a B2C app and also an award-winning B2B2C distribution model where we have partnered with 15 of the largest banks in India such as Indian Bank and UCO Bank to provide wealth products to their customers. In our bank-led distribution model, our SDKs are integrated seamlessly into the bank’s mobile banking and internet banking application. Fisdom is the first wealthtech company in the country to launch a stock broking product for customers of a PSU bank.
The company is breaking down barriers by enabling access to wealth management to underserved customers. All our partners combined have a combined user base of more than 50 crore customers. This makes us uniquely placed to disrupt the wealthtech space which we believe is in its infancy in India in terms of wider adoption.
Where are we now and where are we heading towards
Founded by veteran VC-turned entrepreneur Subramanya SV(Subu) and former investment
banker Anand Dalmia, Fisdom is backed by PayU (Naspers), Quona Capital, and Saama Capital; with $37million of total funds raised so far. Fisdom is known for its revenue and profitability focussed approach towards sustainable business.
Fisdom is the No.1 company in India in the B2B2C wealthtech space and one of the most admired companies in the fintech ecosystem for our business model. We look forward to growing the leadership position by staying focussed on product and technology innovation.
Our technology team
Today we are a 60-member strong technology team. Everyone in the team is a hands-on engineer, including the team leads and managers. We take pride in being product engineers and we believe engineers are fundamentally problem solvers first. Our culture binds us together as one cohesive unit. We stress on engineering excellence and strive to become a high talent density team. Some values that we preach and practice include:
- Individual ownership and collective responsibility
- Focus on continuous learning and constant improvement in every aspect of engineering and product
- Cheer for openness, inclusivity and transparency
- Merit-based growth
What we are looking for
- Are open to work in a flat, non-hierarchical setup where daily focus is only shipping features not reporting to managers
- Experience designing highly interactive web applications with performance, scalability, accessibility, usability, design, and security in mind.
- Experience with distributed (multi-tiered) systems, algorithms, and relational and no-sql databases.
- Ability to break-down larger/fuzzier problems into smaller ones in the scope of the product
- Experience with architectural trade-offs, applying synchronous and asynchronous design patterns, and delivering with speed while maintaining quality.
- Raise the bar on sustainable engineering by improving best practices, producing best in class of code, documentation, testing and monitoring.
- Contributes in code and actively takes part in code reviews.
- Working with the Product Owner/managers to clearly define the scope of multiple sprints. Lead/guide the team through sprint(s) scoping, resource allocation and commitment - the execution plan.
- Drives feature development end-to-end. Active partner with product, design, and peer engineering leads and managers.
- Familiarity with build, release, deployment tools such as Ant, Maven, and Gradle, Docker, Kubernetes, Jenkins etc.
- Effective at influencing a culture of engineering craftsmanship and excellence
- Helps the team make the right choices. Drives adoption of engineering best practices and development processes within their team.
- Understanding security and compliance.
- User authentication and authorisation between multiple systems, servers, and environments.
- Based on your experience, you may lead a small team of Engineers.
If you don't have all of these, that's ok. But be excited about learning the few you don't know.
Skills
Microservices, Engineering Management, Quality management, Technical Architecture, technical lead. Hands-on programming experience in one of languages: Python, Golang.
Additional perks
- Access to large repositories of online courses through Myacademy (includes Udemy, Coursera, Harvard ManageMentor, Udacity and many more). We strongly encourage learning something outside of work as a habit.
- Career planning support/counseling / coaching support. Both internal and external coaches.
- Relocation policy
You will not be a good fit for this role if
- you have experience of only working with services companies or have spent a major part of your time there
- you are not open to shifting to new programming language or stack but exploring a position aligned to your current technical experience
- you are not very hands-on, seek direction constantly and need continuous supervision from a manager to finish tasks
- you like to working alone and mentoring junior engineers does not interest you
- you are looking to work in very large teams
Why join us and where?
We're a small but high performing engineering team. We recognize that the work we do impacts the lives of hundreds and thousands of people. Your work will contribute significantly to our mission. We pay competitive compensation and performance bonuses. We provide a high energy work environment and you are encouraged to play around new technology and self-learning. You will be based out of Bangalore
About us
Fisdom is one of the largest wealthtech platforms that allows investors to manage their wealth in an intuitive and seamless manner. Fisdom has a suite of products and services that takes care of every wealth requirement that an individual would have. This includes Mutual Funds, Stock Broking, Private Wealth, Tax Filing, and Pension funds
Fisdom has a B2C app and also an award-winning B2B2C distribution model where we have partnered with 15 of the largest banks in India such as Indian Bank and UCO Bank to provide wealth products to their customers. In our bank-led distribution model, our SDKs are integrated seamlessly into the bank’s mobile banking and internet banking application. Fisdom is the first wealthtech company in the country to launch a stock broking product for customers of a PSU bank.
The company is breaking down barriers by enabling access to wealth management to underserved customers. All our partners combined have a combined user base of more than 50 crore customers. This makes us uniquely placed to disrupt the wealthtech space which we believe is in its infancy in India in terms of wider adoption.
Where are we now and where are we heading towards
Founded by veteran VC-turned entrepreneur Subramanya SV(Subu) and former investment
banker Anand Dalmia, Fisdom is backed by PayU (Naspers), Quona Capital, and Saama Capital; with $37million of total funds raised so far. Fisdom is known for its revenue and profitability focussed approach towards sustainable business.
Fisdom is the No.1 company in India in the B2B2C wealthtech space and one of the most admired companies in the fintech ecosystem for our business model. We look forward to growing the leadership position by staying focussed on product and technology innovation.
Our technology team
Today we are a 60-member strong technology team. Everyone in the team is a hands-on engineer, including the team leads and managers. We take pride in being product engineers and we believe engineers are fundamentally problem solvers first. Our culture binds us together as one cohesive unit. We stress on engineering excellence and strive to become a high talent density team. Some values that we preach and practice include:
- Individual ownership and collective responsibility
- Focus on continuous learning and constant improvement in every aspect of engineering and product
- Cheer for openness, inclusivity and transparency.
- Merit-based growth
Key Responsibilities
- Write code, build prototypes and resolve issues.
- Write and review unit test cases.
- Review code & designs for both oneself and team members
- Defining and building microservices
- Building systems with positive business outcome
- Tracking module health, usage & behaviour tracking.
Key Skills
- An engineer with at least 1-3 years of working experience in web services, preferably in Python
- Must have a penchant for good API design.
- Must be a stickler for good, clear and secure coding.
- Must have built and released APIs in production.
- Experience in working with RDBMS & NoSQL databases.
- Working knowledge of GCP, AWS, Azure or any other cloud provider.
- Aggressive problem diagnosis & creative problem solving skills.
- Communication skills, to speak to developers across the world
Why join us and where?
We're a small but high performing engineering team. We recognize that the work we do impacts the lives of hundreds and thousands of people. Your work will contribute significantly to our mission. We pay competitive compensation and performance bonuses. We provide a high energy work environment and you are encouraged to play around new technology and self-learning. You will be based out of Bangalore.
Roles and Responsibilities:
- Contribute in all phases of the development lifecycle
- Write well designed, testable, efficient code
- Ensure designs comply with specifications
- Prepare and produce releases of software components
- Support continuous improvement by investigating alternatives and technologies and presenting these for architectural review
- Ensure continual knowledge management
- Adherence to the organizational guidelines and processes
Skills /Competencies: a. Bachelor/Master’s degree with good experience in computer programming b.4+ years working experience in application development using Java
Essential Skills:
- Hands on experience in designing and developing applications using Java EE platforms
- Object Oriented analysis and design using common design patterns.
- Profound insight of Java and JEE internals (Data structure, Algorithm and time complexity, Memory Management, Transaction management etc)
- Excellent knowledge of Relational Databases, SQL and ORM technologies (JPA2, Hibernate)
- Experience in the Spring Framework
- Experience in developing web applications using at least one popular web framework (JSF, Wicket, GWT, Spring MVC) and UI technology (Angular/React JS)
- Ability to operate independently while establishing strong working relationships with co-workers and cross-functional teams
- Strong organizational and prioritization skills
- Demonstrate critical attention to detail and deadlines, and are self-motivated
- Ability to adapt to changes in direction and priorities in a project and deadline-oriented environment
- Strong written and verbal English communication skills
- Problem-solving attitude
Preferred skills Good to have –
- Knowledge in any UI technology (Angular, React, JS)
- Intermediate level knowledge of Unix environment (User commands, not System Admin commands)
- Understanding of capital markets and middle/back office processes in the financial services space
Experience: 1-3 years
Location: Bangalore
Notice:Immediate Joiner
ResponsibilitiesTo attend to user issue and handle tickets raised by user
- Strong experience in SQL , PostgreSQL, PL/SQL
- Providing L2 support depending upon the priority of issue to meet client SLA.
- Regularly monitoring S-MAX ticketing tool.
- Incident management- logging, prioritizing and resolving/debugging incidents. Worked on various monitoring tools like open bravo, GCP.
- Writing the SQL queries as per the business need.
- JAVA HTML, CSS Modify existing Shell Script on Unix Platform whenever required. Interact with Global Customers / Users. Scheduling the jobs through Crontab.
- TOOLS Regularly checking the business mail and reply to the clients or the respective team
- Excellent written and verbal communication skills in English with the ability to clearly articulate solutions to complex technical problems Ability to work with Business heads, administrators and developers Excellent time management skills
Total Experience:1-2 Years
Job Type: Contract
Notice: Immediate Joiner
- 1-2 years proven track record of development and design work in the IT industry, preferably in a software product based organization
- Java, springboot, and Microservices
- SQL
- Startup product experience – hustled through various tech, products, stacks
- Having strong experience in Data structures and Algorithms.(Must).
- Good to have experience in Complex Problem solving
Company: Optimum Solutions
About the company: Optimum solutions is a leader in a sheet metal industry, provides sheet metal solutions to sheet metal fabricators with a proven track record of reliable product delivery. Starting from tools through software, machines, we are one stop shop for all your technology needs.
Role Overview:
- Creating and managing database schemas that represent and support business processes, Hands-on experience in any SQL queries and Database server wrt managing deployment.
- Implementing automated testing platforms, unit tests, and CICD Pipeline
- Proficient understanding of code versioning tools, such as GitHub, Bitbucket, ADO
- Understanding of container platform, such as Docker
Job Description
- We are looking for a good Python Developer with Knowledge of Machine learning and deep learning framework.
- Your primary focus will be working the Product and Usecase delivery team to do various prompting for different Gen-AI use cases
- You will be responsible for prompting and building use case Pipelines
- Perform the Evaluation of all the Gen-AI features and Usecase pipeline
Position: AI ML Engineer
Location: Chennai (Preference) and Bangalore
Minimum Qualification: Bachelor's degree in computer science, Software Engineering, Data Science, or a related field.
Experience: 4-6 years
CTC: 16.5 - 17 LPA
Employment Type: Full Time
Key Responsibilities:
- Take care of entire prompt life cycle like prompt design, prompt template creation, prompt tuning/optimization for various Gen-AI base models
- Design and develop prompts suiting project needs
- Lead and manage team of prompt engineers
- Stakeholder management across business and domains as required for the projects
- Evaluating base models and benchmarking performance
- Implement prompt gaurdrails to prevent attacks like prompt injection, jail braking and prompt leaking
- Develop, deploy and maintain auto prompt solutions
- Design and implement minimum design standards for every use case involving prompt engineering
Skills and Qualifications
- Strong proficiency with Python, DJANGO framework and REGEX
- Good understanding of Machine learning framework Pytorch and Tensorflow
- Knowledge of Generative AI and RAG Pipeline
- Good in microservice design pattern and developing scalable application.
- Ability to build and consume REST API
- Fine tune and perform code optimization for better performance.
- Strong understanding on OOP and design thinking
- Understanding the nature of asynchronous programming and its quirks and workarounds
- Good understanding of server-side templating languages
- Understanding accessibility and security compliance, user authentication and authorization between multiple systems, servers, and environments
- Integration of APIs, multiple data sources and databases into one system
- Good knowledge in API Gateways and proxies, such as WSO2, KONG, nginx, Apache HTTP Server.
- Understanding fundamental design principles behind a scalable and distributed application
- Good working knowledge on Microservices architecture, behaviour, dependencies, scalability etc.
- Experience in deploying on Cloud platform like Azure or AWS
- Familiar and working experience with DevOps tools like Azure DEVOPS, Ansible, Jenkins, Terraform
Job Description:
- Experience in Core Java, Spring Boot.
- Experience in microservices.
- Extensive experience in developing enterprise-scale systems for global organization. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should be able to analyze, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS in SQL Server, Postgres, Oracle or DB2
- Good knowledge of multi-threading
- Basic working knowledge of Unix/Linux
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication and analytical skills.
- Should be able to express their design ideas and thoughts.
About Wissen Technology:
Wissen Technology is a niche global consulting and solutions company that brings unparalleled domain expertise in Banking and Finance, Telecom and Startups. Wissen Technology is a part of Wissen Group and was established in the year 2015. Wissen has offices in the US, India, UK, Australia, Mexico, and Canada, with best-in class infrastructure and development facilities. Wissen has successfully delivered projects worth $1 Billion for more than 25 of the Fortune 500 companies. The Wissen Group overall includes more than 4000 highly skilled professionals.
Wissen Technology provides exceptional value in mission critical projects for its clients, through thought leadership, ownership, and assured on-time deliveries that are always ‘first time right’.
Our team consists of 1200+ highly skilled professionals, with leadership and senior management executives who have graduated from Ivy League Universities like Wharton, MIT, IITs, IIMs, and NITs and with rich work experience in some of the biggest companies in the world.
Wissen Technology offers an array of services including Application Development, Artificial Intelligence & Machine Learning, Big Data & Analytics, Visualization & Business Intelligence, Robotic Process Automation, Cloud, Mobility, Agile & DevOps, Quality Assurance & Test Automation.
We have been certified as a Great Place to Work® for two consecutive years (2020-2022) and voted as the Top 20 AI/ML vendor by CIO Insider.
Greetings , Wissen Technology is Hiring for the position of Data Engineer
Please find the Job Description for your Reference:
JD
- Design, develop, and maintain data pipelines on AWS EMR (Elastic MapReduce) to support data processing and analytics.
- Implement data ingestion processes from various sources including APIs, databases, and flat files.
- Optimize and tune big data workflows for performance and scalability.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions.
- Manage and monitor EMR clusters, ensuring high availability and reliability.
- Develop ETL (Extract, Transform, Load) processes to cleanse, transform, and store data in data lakes and data warehouses.
- Implement data security best practices to ensure data is protected and compliant with relevant regulations.
- Create and maintain technical documentation related to data pipelines, workflows, and infrastructure.
- Troubleshoot and resolve issues related to data processing and EMR cluster performance.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering, with a focus on big data technologies.
- Strong experience with AWS services, particularly EMR, S3, Redshift, Lambda, and Glue.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with big data frameworks and tools such as Hadoop, Spark, Hive, and Pig.
- Solid understanding of data modeling, ETL processes, and data warehousing concepts.
- Experience with SQL and NoSQL databases.
- Familiarity with CI/CD pipelines and version control systems (e.g., Git).
- Strong problem-solving skills and the ability to work independently and collaboratively in a team environment
● 5+ years of experience as a Data engineer or related role.
● 5+ years of experience in application development using Python
● Strong experience with SQL and good to have NoSQL.
● Experience with Agile engineering practices.
● Preferred experience in writing queries for RDBMS, cloud-based data warehousing solutions like
Snowflake.
● Ability to work independently or as part of a team.
● Experience with cloud platforms, preferably AWS, is good to have
● Experience with ETL/LT tools and methodologies.
● Experience working on real-time Data Streaming and Data Streaming platforms
Job Location
Bengaluru, India
Job Type
Contractual Position with Potential for Full-Time Employment Based on Performance Review
Experience Required
4 years of experience in Java & Technologies
Company Description
Welcome to Unitalks Technologies! We are a managed services provider that helps our clients hire top-quality talent. We are hiring for one of our clients in Bengaluru, more details will be provided about the company and culture later in the process.
Role Description
This is a contract role for a Senior Java Software Engineer. As a Senior Java Software Engineer, you will be responsible for designing and developing software solutions, building and maintaining Microservices architectures, and programming using Spring Boot. This role is based in Bengaluru and is an on-site position.
Qualifications
- Software Development and Programming skills
- Experience with Microservices and the Spring Boot
- Proficiency in Java, SQL and UNIX
- Good in deployment on different platforms like AWS, Azure, etc.
- Strong problem-solving and analytical skills
- Excellent communication and teamwork abilities
- Bachelor's or Master's degree in Computer Science or a related field
- Experience with Agile methodologies and CI/CD pipelines
- Knowledge of cloud platforms and containerization technologies is a plus
- Sr. Solution Architect
- Job Location – Bangalore
- Need candidates who can join in 15 days or less.
- Overall, 12-15 years of experience.
Looking for this tech stack in a Sr. Solution Architect (who also has a Delivery Manager background). Someone who has heavy business and IT stakeholder collaboration and negotiation skills, someone who can provide thought leadership, collaborate in the development of Product roadmaps, influence decisions, negotiate effectively with business and IT stakeholders, etc.
- Building data pipelines using Azure data tools and services (Azure Data Factory, Azure Databricks, Azure Function, Spark, Azure Blob/ADLS, Azure SQL, Snowflake..)
- Administration of cloud infrastructure in public clouds such as Azure
- Monitoring cloud infrastructure, applications, big data pipelines and ETL workflows
- Managing outages, customer escalations, crisis management, and other similar circumstances.
- Understanding of DevOps tools and environments like Azure DevOps, Jenkins, Git, Ansible, Terraform.
- SQL, Spark SQL, Python, PySpark
- Familiarity with agile software delivery methodologies
- Proven experience collaborating with global Product Team members, including Business Stakeholders located in NA
Dear Candidate,
Deepika here from Amazech Solutions Pvt Ltd. Please find the below short JD for your Reference.
About AMAZECH SOLUTIONS
Amazech Solutions is a Consulting and Services company in the Information Technology Industry. Established in 2007, we are headquartered in Frisco, Texas, U.S.A. The leadership team at Amazech brings to the table expertise that stems from over 40-man years of experience in developing software solutions in global organizations in various verticals including Healthcare, Banking Services, and Media & Entertainment
We currently provide services to a wide spectrum of clients ranging from start-ups to Fortune 500 companies. We are actively engaged in Government projects, being an SBA approved company as well as being HUB certified by the State of Texas.
Our customer-centric approach comes from understanding that our clients need more than technology professionals. This is an exciting time to join Amazech as we look to grow our team in India which comprises of IT professionals with strong competence in both common and niche skill areas.
As an Inside Sales/Cold Calling Representative, you will be responsible for generating leads, making outbound calls to potential customers, and nurturing leads through the sales funnel. Your goal will be to identify and qualify prospects, establish rapport, and schedule meetings or appointments for our sales team.
Location: Bangalore (Hybrid)
Employment type: Full time.
Permanent website: www.amazech.com
JOB DESCRIPTION: Project Manager
Experience: 8-13 years
JOB DESCRIPTION:
Experience and Required Skill Sets:
1. Solid experience on software design and development using Microsoft
ASP.NET framework, C#, Visual Studio and in writing clean, readable and easily
maintainable code
2. 6+ years of experience in WCF, WPF, VB.NET, SQL Server, JQuery, Angular OR React Native.
3. 6+ years of experience in developing highly scalable REST/Web APIs
4. 6+ years of experience in developing highly efficient SQL/NoSQL Queries
5. 6+ years of experience in
Relational and Document databases
6. Strong understanding of object-oriented programming and possess strong
fundamental design principles for building a scalable application.
7. Experience in using the MVC framework, working knowledge on APIs and
Service Oriented Architecture
8. Strong understanding of object-oriented programming and possess strong
fundamental design principles for building a scalable application.
9. Experience in Angular version 5/6 and above is
required
10. Expert at SQL programming & Database Design (Stored procedure,
Function, Trigger, Constraints, Replication)
11. Strong ability in HTML and JavaScript (AJAX, DOM)
12. Development experience with data integration technologies including
REST, SOAP, JSON, XML
13. Expertise in programming and usage of any unit testing framework.
Expertise in integration testing and continuous integration.
14. Experience working in a DevOps environment will be an added advantage.
Other Requirements:
·
A Bachelor's or Master's degree (Engineering or
computer related degree preferred)
·
Strong understanding of Software Development Life
Cycles including Agile/Scrum
·
At least one relevant Microsoft Certification
preferred
Responsibilities:
·
Ability to create complex, enterprise-transforming
applications that meet and exceed client expectations.
·
Responsible for bottom line. Strong project
management abilities. Ability to encourage team to stick to timelines.
·
Follow best-in-class coding practices and review the
solutions vis-à-vis the requirements and technical specifications.
·
Should demonstrate strong client
interfacing capabilities like emails and calls with clients in the US/UK
at Gipfel & Schnell Consultings Pvt Ltd
Data Engineer
Brief Posting Description:
This person will work independently or with a team of data engineers on cloud technology products, projects, and initiatives. Work with all customers, both internal and external, to make sure all data related features are implemented in each solution. Will collaborate with business partners and other technical teams across the organization as required to deliver proposed solutions.
Detailed Description:
· Works with Scrum masters, product owners, and others to identify new features for digital products.
· Works with IT leadership and business partners to design features for the cloud data platform.
· Troubleshoots production issues of all levels and severities, and tracks progress from identification through resolution.
· Maintains culture of open communication, collaboration, mutual respect and productive behaviors; participates in the hiring, training, and retention of top tier talent and mentors team members to new and fulfilling career experiences.
· Identifies risks, barriers, efficiencies and opportunities when thinking through development approach; presents possible platform-wide architectural solutions based on facts, data, and best practices.
· Explores all technical options when considering solution, including homegrown coding, third-party sub-systems, enterprise platforms, and existing technology components.
· Actively participates in collaborative effort through all phases of software development life cycle (SDLC), including requirements analysis, technical design, coding, testing, release, and customer technical support.
· Develops technical documentation, such as system context diagrams, design documents, release procedures, and other pertinent artifacts.
· Understands lifecycle of various technology sub-systems that comprise the enterprise data platform (i.e., version, release, roadmap), including current capabilities, compatibilities, limitations, and dependencies; understands and advises of optimal upgrade paths.
· Establishes relationships with key IT, QA, and other corporate partners, and regularly communicates and collaborates accordingly while working on cross-functional projects or production issues.
Job Requirements:
EXPERIENCE:
2 years required; 3 - 5 years preferred experience in a data engineering role.
2 years required, 3 - 5 years preferred experience in Azure data services (Data Factory, Databricks, ADLS, Synapse, SQL DB, etc.)
EDUCATION:
Bachelor’s degree information technology, computer science, or data related field preferred
SKILLS/REQUIREMENTS:
Expertise working with databases and SQL.
Strong working knowledge of Azure Data Factory and Databricks
Strong working knowledge of code management and continuous integrations systems (Azure DevOps or Github preferred)
Strong working knowledge of cloud relational databases (Azure Synapse and Azure SQL preferred)
Familiarity with Agile delivery methodologies
Familiarity with NoSQL databases (such as CosmosDB) preferred.
Any experience with Python, DAX, Azure Logic Apps, Azure Functions, IoT technologies, PowerBI, Power Apps, SSIS, Informatica, Teradata, Oracle DB, and Snowflake preferred but not required.
Ability to multi-task and reprioritize in a dynamic environment.
Outstanding written and verbal communication skills
Working Environment:
General Office – Work is generally performed within an office environment, with standard office equipment. Lighting and temperature are adequate and there are no hazardous or unpleasant conditions caused by noise, dust, etc.
physical requirements:
Work is generally sedentary in nature but may require standing and walking for up to 10% of the time.
Mental requirements:
Employee required to organize and coordinate schedules.
Employee required to analyze and interpret complex data.
Employee required to problem-solve.
Employee required to communicate with the public.
Responsibilities:
· Analyze complex data sets to answer specific questions using MMIT’s market access data (MMIT) and Norstella claims data, third-party claims data (IQVIA LAAD, Symphony SHA). Applicant must have experience working with the aforementioned data sets exclusively.
· Deliver consultative services to clients related to MMIT RWD sets
· Produce complex analytical reports using data visualization tools such as Power BI or Tableau
· Define customized technical specifications to surface MMIT RWD in MMIT tools.
· Execute work in a timely fashion with high accuracy, while managing various competing priorities; Perform thorough troubleshooting and execute QA; Communicate with internal teams to obtain required data
· Ensure adherence to documentation requirements, process workflows, timelines, and escalation protocols
· And other duties as assigned.
Requirements:
· Bachelor’s Degree or relevant experience required
· 2-5 yrs. of professional experience in RWD analytics using SQL
· Fundamental understanding of Pharma and Market access space
· Strong analysis skills and proficiency with tools such as Tableau or PowerBI
· Excellent written and verbal communication skills.
· Analytical, critical thinking and creative problem-solving skills.
· Relationship building skills.
· Solid organizational skills including attention to detail and multitasking skills.
· Excellent time management and prioritization skills.
Responsibilities:
• Build customer facing solution for Data Observability product to monitor Data Pipelines
• Work on POCs to build new data pipeline monitoring capabilities.
• Building next-generation scalable, reliable, flexible, high-performance data pipeline capabilities for ingestion of data from multiple sources containing complex dataset.
•Continuously improve services you own, making them more performant, and utilising resources in the most optimised way.
• Collaborate closely with engineering, data science team and product team to propose an optimal solution for a given problem statement
• Working closely with DevOps team on performance monitoring and MLOps
Required Skills:
• 3+ Years of Data related technology experience.
• Good understanding of distributed computing principles
• Experience in Apache Spark
• Hands on programming with Python
• Knowledge of Hadoop v2, Map Reduce, HDFS
• Experience with building stream-processing systems, using technologies such as Apache Storm, Spark-Streaming or Flink
• Experience with messaging systems, such as Kafka or RabbitMQ
• Good understanding of Big Data querying tools, such as Hive
• Experience with integration of data from multiple data sources
• Good understanding of SQL queries, joins, stored procedures, relational schemas
• Experience with NoSQL databases, such as HBase, Cassandra/Scylla, MongoDB
• Knowledge of ETL techniques and frameworks
• Performance tuning of Spark Jobs
• General understanding of Data Quality is a plus point
• Experience on Databricks,snowflake and BigQuery or similar lake houses would be a big plus
• Nice to have some knowledge in DevOps
Job Description: Data Engineer
Experience: Over 4 years
Responsibilities:
- Design, develop, and maintain scalable data pipelines for efficient data extraction, transformation, and loading (ETL) processes.
- Architect and implement data storage solutions, including data warehouses, data lakes, and data marts, aligned with business needs.
- Implement robust data quality checks and data cleansing techniques to ensure data accuracy and consistency.
- Optimize data pipelines for performance, scalability, and cost-effectiveness.
- Collaborate with data analysts and data scientists to understand data requirements and translate them into technical solutions.
- Develop and maintain data security measures to ensure data privacy and regulatory compliance.
- Automate data processing tasks using scripting languages (Python, Bash) and big data frameworks (Spark, Hadoop).
- Monitor data pipelines and infrastructure for performance and troubleshoot any issues.
- Stay up to date with the latest trends and technologies in data engineering, including cloud platforms (AWS, Azure, GCP).
- Document data pipelines, processes, and data models for maintainability and knowledge sharing.
- Contribute to the overall data governance strategy and best practices.
Qualifications:
- Strong understanding of data architectures, data modelling principles, and ETL processes.
- Proficiency in SQL (e.g., MySQL, PostgreSQL) and experience with big data querying languages (e.g., Hive, Spark SQL).
- Experience with scripting languages (Python, Bash) for data manipulation and automation.
- Experience with distributed data processing frameworks (Spark, Hadoop) (preferred).
- Familiarity with cloud platforms (AWS, Azure, GCP) for data storage and processing (a plus).
- Experience with data quality tools and techniques.
- Excellent problem-solving, analytical, and critical thinking skills.
- Strong communication, collaboration, and teamwork abilities.
About HeyCoach:
We are an exceptional group of highly skilled individuals, passionate about addressing a fundamental challenge within the education industry. Our team consists of talented geeks who possess a deep understanding of the issues at hand and are dedicated to finding innovative solutions. In our quest for excellence, we are constantly seeking out remarkable individuals who can contribute to our growth and success. Whether it's developing cutting-edge technologies, designing immersive learning experiences, or implementing groundbreaking teaching methodologies, we consistently strive for excellence.
Job Description:
- Mobile App Development: Collaborate with cross-functional teams to design, develop, test, and deploy robust and scalable Android applications.
- Code Optimisation: Write clean, maintainable, and efficient code, with a focus on performance and responsiveness. Identify and address bottlenecks and bugs.
- UI/UX Implementation: Work closely with designers to implement visually appealing and intuitive user interfaces. Ensure seamless integration between the front-end and back-end components.
- API Integration: Integrate with RESTful APIs and third-party services to enhance app functionality and data exchange.
- Testing and Debugging: Conduct thorough testing of applications, including unit testing and debugging. Collaborate with quality assurance teams to ensure the delivery of high-quality software.
- Platform Compatibility: Stay updated on the latest Android platform updates and ensure compatibility with various devices and screen sizes.
- Performance Optimization: Continuously optimize application performance, keeping up with best practices and industry standards.
- Collaboration: Work closely with other developers and team members to achieve project goals. Participate in code reviews and knowledge-sharing sessions.
Requirements:
- Engineers with 1-6 years of experience in shipping consumer-facing Android apps with the large user base, ideally currently available on the Google Play Store.
- Top-notch programming skills in Java, Kotlin and Android along with MVVM, Dagger2, Room, LiveData, Coroutine & JetPack-Components.
- Command of memory management, view hierarchy, battery optimisation and in-depth experience with multi threaded and networked applications.
- Worked with Restful APIs, third-party SDK Integrations and common technologies like HTTPS, JSON, OAuth, and SQL.
- Thorough working knowledge of Android Studio with the Gradle build system.
- Understanding the quirks of the fragmented ecosystem of Android OS versions and devices.
- Focus on Material Design principles and pixel-perfect implementation of the design into code.
- Solid experience with Git Care for quality with an obsession for performance and willingness to spend time testing the team's work as well as yours.
Position: Java Developer
Experience: 3-8 Years
Location: Bengaluru
We are a multi-award-winning creative engineering company offering design and technology solutions on mobile, web and cloud platforms. We are looking for an enthusiastic and self-driven Test Engineer to join our team.
Roles and Responsibilities:
- Expert level Micro Web Services development skills using Java/J2EE/Spring
- Strong in SQL and noSQL databases (mySQL / MongoDB preferred) Ability to develop software programs with best of design patterns , data Structures & algorithms
- Work in very challenging and high performance environment to clearly understand and provide state of the art solutions ( via design and code)
- Ability to debug complex applications and help in providing durable fixes
- While Java platform is primary, ability to understand, debug and work on other application platforms using Ruby on Rails and Python
- Responsible for delivering feature changes and functional additions that handle millions of requests per day while adhering to quality and schedule targets
- Extensive knowledge of at least 1 cloud platform (AWS, Microsoft Azure, GCP) preferably AWS.
- Strong unit testing skills for frontend and backend using any standard framework
- Exposure to application gateways and dockerized. microservices
- Good knowledge and experience with Agile, TDD or BDD methodologies
Desired Profile:
- Programing language – Java
- Framework – Spring Boot
- Good Knowledge of SQL & NoSQL DB
- AWS Cloud Knowledge
- Micro Service Architecture
Good to Have:
- Familiarity with Web Front End (Java Script/React)
- Familiarity with working in Internet of Things / Hardware integration
- Docker & Kubernetes Serverless Architecture
- Working experience in Energy Company (Solar Panels + Battery)
At Advisor360°, we hire people with all kinds of awesome experiences, backgrounds, and perspectives. We like it that way. So even if you don’t meet every single requirement, please consider applying if you like what you see.
As a Full stack Software Engineer, you’ll be part of a team that’s responsible for developing several of Advisor360°’s most visible and critical web applications, including our public-facing application, Investor360°. These products are at the heart of what we offer to our advisors and their clients. In addition to having an Agile mindset and a desire to produce great software, you’ll need a positive attitude and excellent communication skills.
Key responsibilities
- Plan and implement mid- to large-scale projects from conception to completion
- Understand how to adapt theory and best practices to fit the needs of the project
- Deep knowledge and understanding of technology software design patterns and code concepts
- Demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, code comments, and clear code
- Troubleshoot, debug, and upgrade existing systems
- Deploy programs and evaluate user feedback
- Document and maintain software functionality
- Show an appetite and aptitude for taking responsibility for technical decisions
- Assist with the direction for the team
- Collaborate with team members on effective development practices and communicate with tact, professionalism, and an eye toward team progression
Requirements
- 5+ years of programming experience in ASP.NET (C# or VB.NET), including MVC and Web API
- Experience serving as technical lead throughout the full software development lifecycle: conception, architecture definition, detailed design, scoping, planning, implementation, testing, documentation, delivery, and maintenance is preferred
- Knowledge of professional software engineering and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Knowledge of Angular 5/npm/Swagger/TypeScript/.NET Core, or related technologies, along with client-side frameworks and languages, such as jQuery
- Proficiency in SQL/relational databases and TFS including building and release definitions within a CI and CD environment
Additional skills and knowledge
- Knowledge of Microsoft Azure platform a plus
- Ability to work in a fast-paced, Agile/Scrum environment
- A positive attitude and excellent communication skills
- An Agile mindset and a desire to produce great software
About the role
A backend engineer at Peoplebox is a one who loves to look for patterns and extremely resourceful.
You will be exposed to different challenges right from building multi-source integration platform to working deeply in Kubernetes clusters. At the core, you should be great at building system design, great at SQL, breaking down broad-problems into smaller sub-problems and build clarity.
We also value self-drivenness. Have you done something that you did out of what was asked for you to be done? Did you do a hobby project? Did you host it somewhere? Did you try to get some users to it? Did you do something at work that was not really part of the sprint but it added value?
We are looking for a self-driven Senior Backend Engineer who can join us and take ownership of releases that they do.
Roles and Responsibilities
- Typical Day: Your typical day would be building data models and APIs that would be consumed by Frontend engineers.
- Teamwork: Your success is directly proportional to how well you collaborate with your Frontend Partner. We’d expect you to be pro-active and communicate well over Slack and Calls.
- Problem Solving: You’ll be solving hard problems in architecture and integrations that enable the business.
- Delivery: We really value timely delivery. Speed is what makes us different. You should be ready to move fast and release.
- Self-Driven: We’d like people who don’t like to be managed and want to be self-driven. You are not waiting for someone to tell you what to do.
Experience:
Hands-on experience creating web applications using Ruby on Rails
Knowledge of Rails-related tools and practices like RSpec, TDD/BDD
Good understanding of HTTP and how the web works.
Proficient with database schema design, indexing, query optimization
Good understanding of MVC design pattern
Experience using “git” is desired but not mandatory.
Basic knowledge of web technologies desired (HTML, CSS, JS and jQuery)
Good knowledge of automated deployment, monitoring and performance analysis
Hands-on DevOps experience is greatly desired
Knowledge of cloud and server infrastructure (AWS in particular) will be a huge plus
Reputed Client
Application Integration Engineer - HR/Payroll
Skills:
· Experience on HR Payroll SaaS
o HR/Payroll applications: Ideally UKG, multiple others ok
o Financial applications: ideally knowledge of how HR/Payroll systems interact with ERPs (ideally NetSuite), and other personnel expenses (i.e., Expensify, Concur)
· Knowledge of various Integrations approaches:
o Native connectors
o SFTP files
o API – schema, design
· Knowledge of multiple coding languages: Java script, XML, REST, SOAP etc.
· Ability to work with data bases (ideally SQL, Oracle, PostgreSQL), including ETL
· Some knowledge of Azure services, mainly SQL and Oracle databases, storage, analytics
· Experience working with global teams
· Ability to overlap some work hours with US EST/CT time
· Solid communication skills
· Proactive, takes initiative, out-spoken
Experience
· 5-8 of years
Greetings!
Wissen Technology is hiring for Java developers/Lead.
Required Skillset
- 6+ years of industrial experience
- Java and related frameworks.
- Experience in Core Java 1.8 and above, Data Structures, OOPS, Multithreading, Microservices, Spring and Swift Payments
- Exposure on Fintech domain is mandatory
- Should have the ability to analyse, design, develop and test complex, low-latency client-facing applications.
- Good development experience with RDBMS
- Good knowledge of multi-threading and high volume server side development
- Excellent problem solving and coding skills in Java
- Strong interpersonal, communication and analytical skills.
Company Profile:
Company Name : Wissen Technology
Group of companies in India : Wissen Technology & Wissen Infotech
Work Location - Bangalore
Website : www.wissen.com
Wissen Thought leadership : https://www.wissen.com/articles/
LinkedIn: https://www.linkedin.com/company/wissen-technology
- As a data engineer, you will build systems that collect, manage, and convert raw data into usable information for data scientists and business analysts to interpret. You ultimate goal is to make data accessible for organizations to optimize their performance.
- Work closely with PMs, business analysts to build and improvise data pipelines, identify and model business objects • Write scripts implementing data transformation, data structures, metadata for bringing structure for partially unstructured data and improvise quality of data
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL
- Own data pipelines - Monitoring, testing, validating and ensuring meaningful data exists in data warehouse with high level of data quality
- What we look for in the candidate is strong analytical skills with the ability to collect, organise, analyse, and disseminate significant amounts of information with attention to detail and accuracy
- Create long term and short-term design solutions through collaboration with colleagues
- Proactive to experiment with new tools
- Strong programming skill in python
- Skillset: Python, SQL, ETL frameworks, PySpark and Snowflake
- Strong communication and interpersonal skills to interact with senior-level management regarding the implementation of changes
- Willingness to learn and eagerness to contribute to projects
- Designing datawarehouse and most appropriate DB schema for the data product
- Positive attitude and proactive problem-solving mindset
- Experience in building data pipelines and connectors
- Knowledge on AWS cloud services would be preferred
The Above Mentioned budget is for
Experience with QE for distributed, highly scalable systems • Good understanding of OOPS concepts and strong programming skills in Java, Groovy or JavaScript • Hands on experience of working with at least one of GUI based test automation tools for desktop and/or mobile automation. Experience on multiple tools will be added advantage • Proficient in writing SQL queries • Familiarity with process of test automation tool selection & test approach • Experience in designing and development of automation framework and creation of scripts using best industry practices such as Page object model • Integrate test suites into the test management system and custom test harness • Familiar with implementation of design patterns, modularization, and user libraries for framework creation • Can mentor team as well as has short learning curve for new technology • Understands all aspects of Quality Engineering • Understanding of SOAP and REST principles • Thorough understanding of microservices architecture • In-depth hands-on experience of working with at least one API testing tool like RestAssured, SOAP UI, NodeJS • Hands-on experience working with Postman or similar tool • Hands-on experience in parsing complex JSON & XML and data validation using serialization techniques like POJO classes or similar • Hands-on experience in performing Request and Response Schema validation, Response codes and exceptions • Good Understanding of BDD, TDD methodologies and tools like Cucumber, TestNG, Junit or similar. • Experience in defining API E2E testing strategy, designing and development of API automation framework • Working experience on build tools Maven / Gradle, Git etc. • Experience in creating test pipeline – CI/CD
Title:- Lead Data Engineer
Experience: 10+y
Budget: 32-36 LPA
Location: Bangalore
Work of Mode: Work from office
Primary Skills: Data Bricks, Spark, Pyspark,Sql, Python, AWS
Qualification: Any Engineering degree
Roles and Responsibilities:
• 8 - 10+ years’ experience in developing scalable Big Data applications or solutions on
distributed platforms.
• Able to partner with others in solving complex problems by taking a broad
perspective to identify.
• innovative solutions.
• Strong skills building positive relationships across Product and Engineering.
• Able to influence and communicate effectively, both verbally and written, with team
members and business stakeholders
• Able to quickly pick up new programming languages, technologies, and frameworks.
• Experience working in Agile and Scrum development process.
• Experience working in a fast-paced, results-oriented environment.
• Experience in Amazon Web Services (AWS) mainly S3, Managed Airflow, EMR/ EC2,
IAM etc.
• Experience working with Data Warehousing tools, including SQL database, Presto,
and Snowflake
• Experience architecting data product in Streaming, Serverless and Microservices
Architecture and platform.
• Experience working with Data platforms, including EMR, Airflow, Databricks (Data
Engineering & Delta
• Lake components, and Lakehouse Medallion architecture), etc.
• Experience with creating/ configuring Jenkins pipeline for smooth CI/CD process for
Managed Spark jobs, build Docker images, etc.
• Experience working with distributed technology tools, including Spark, Python, Scala
• Working knowledge of Data warehousing, Data modelling, Governance and Data
Architecture
• Working knowledge of Reporting & Analytical tools such as Tableau, Quicksite
etc.
• Demonstrated experience in learning new technologies and skills.
• Bachelor’s degree in computer science, Information Systems, Business, or other
relevant subject area
Title:- Senior Data Engineer
Experience: 4-6 yrs
Budget: 24-28 lpa
Location: Bangalore
Work of Mode: Work from office
Primary Skills: Data Bricks, Spark, Pyspark,Sql, Python, AWS
Qualification: Any Engineering degree
Responsibilities:
∙Design and build reusable components, frameworks and libraries at scale to support analytics products.
∙Design and implement product features in collaboration with business and Technology
stakeholders.
∙Anticipate, identify and solve issues concerning data management to improve data quality.
∙Clean, prepare and optimize data at scale for ingestion and consumption.
∙Drive the implementation of new data management projects and re-structure of the current data architecture.
∙Implement complex automated workflows and routines using workflow scheduling tools.
∙Build continuous integration, test-driven development and production deployment
frameworks.
∙Drive collaborative reviews of design, code, test plans and dataset implementation performed by other data engineers in support of maintaining data engineering standards.
∙Analyze and profile data for the purpose of designing scalable solutions.
∙Troubleshoot complex data issues and perform root cause analysis to proactively resolve
product and operational issues.
∙Mentor and develop other data engineers in adopting best practices.
Qualifications:
Primary skillset:
∙Experience working with distributed technology tools for developing Batch and Streaming pipelines using
o SQL, Spark, Python, PySpark [4+ years],
o Airflow [3+ years],
o Scala [2+ years].
∙Able to write code which is optimized for performance.
∙Experience in Cloud platform, e.g., AWS, GCP, Azure, etc.
∙Able to quickly pick up new programming languages, technologies, and frameworks.
∙Strong skills building positive relationships across Product and Engineering.
∙Able to influence and communicate effectively, both verbally and written, with team members and business stakeholders
∙Experience with creating/ configuring Jenkins pipeline for smooth CI/CD process for Managed Spark jobs, build Docker images, etc.
∙Working knowledge of Data warehousing, Data modelling, Governance and Data Architecture
Good to have:
∙Experience working with Data platforms, including EMR, Airflow, Databricks (Data
Engineering & Delta Lake components, and Lakehouse Medallion architecture), etc.
∙Experience working in Agile and Scrum development process.
∙Experience in EMR/ EC2, Databricks etc.
∙Experience working with Data warehousing tools, including SQL database, Presto, and
Snowflake
∙Experience architecting data product in Streaming, Serverless and Microservices Architecture and platform.
About Company :-
Our Client is the world’s largest media investment company and is a part of WPP. In fact, we are responsible for one in every three ads you see globally. We are currently looking for a Manager -Social to join us. As part of the largest media agency in India, you’ll have the opportunity to leverage the scale that comes with the job. You will become an integral part of this growing team and will be working with both internal teams and external parties to ensure campaign delivery objectives are met.
This team is responsible for delivering international solutions, particularly in APAC & EMEA, with some global influence. You will enjoy working in a collaborative team environment and will hold a ‘can do’ attitude with the passion to learn and grow.
At APAC, our people are our strength, which is why fostering a culture of diversity and inclusion is important to us.
Key Responsibilities:
- Lead QA team for effective testing, validation, and verification.
- Implement QA processes and standards.
- Design and execute test plans, cases, and automated scripts.
- Utilize SQL, Excel, and Python for data analysis and testing strategies.
- Validate data across platforms and reports.
- Analyze data for system improvement.
- Monitor and report key quality metrics.
- Develop and maintain process documentation.
Requirements
- Bachelor’s degree in CS, IT, Engineering, or related field.
- 3+ years of QA experience.
- Proficient in SQL, Excel, and Python.
- Strong knowledge of QA methodologies and tools.
- Experience in commerce or digital marketing preferred.
- Demonstrated leadership skills.
- Excellent problem-solving and communication.
- Proactive in task optimization.
- Systems approach to problem-solving.
- Strong attention to detail.
Looking for a .Net Core Engineer | Bangalore to join a team of rockstar developers. The candidate should have a minimum of 2+ yrs. of experience in .Net Core.There are multiple openings. If you're looking for career growth & a chance to work with the top 0.1% of developers in the industry, this one is for you! You will report into IIT'ans/BITS grads with 10+ years of development experience + work with F500 companies (our customers).Company Background - CodeVyasa is a Software Product-Engineering and Development company that helps Early-stage & Mid-Market Product companies with IT Consulting, App Development, and On-demand Tech Resources. Our Journey over the last 3 years has been nothing short of a roller-coaster. Along our way, we've won some of the most prestigious awards while driving immense value to our customers & employees. Here's the link to our website (codevyasa.com). To give you a sense of our growth rate, we've added 70+ employees in the last 6 weeks itself and expect another 125+ by the end of Q1 2024
Requirements:
1)Bachelor's degree in Computer Science, Information Technology, or related field (or equivalent experience).
2)Minimum of 2 years of experience as a .Net Developer
3)Proficiency in MVC Framework, WEBAPI, Webform, C#,sql queries
4) Aptitude for learning new technologies quickly.
5)Good problem-solving and analytical skills
.What We Offer:
1)Glassdoor rating of 4.8, indicating high employee satisfaction.
2) Free healthcare benefits.
3)Strong focus on upskilling and professional development opportunities.
4)Diverse and inclusive work environment.
5)Competitive compensation and benefits package.
6)Emphasis on maintaining a healthy work-life balance.
About Vyapar:
We are a technology and innovation company in the fintech space, delivering business accounting software to Micro, Small & Medium Enterprises (MSMEs). With more than 5 Million users across 140 countries, we are one of the fastest growing companies in this space. We take the complexity out of invoicing, inventory management & accounting, making it so simple, such that small businesses can spend less time on manual bookkeeping and spend more time focusing on areas of business that matter.
Role Summary:
Vyapar's Engineering team builds the technology platform that eases and digitizes our customers' bookkeeping and enables the transition of cumbersome accounting data from general bookkeeping to a digitized always available resource.
The Javascript engineer will be responsible for the developing features in Vyapar application. Strong understanding of HTML, CSS, JavaScript, Responsive design, JQuery, React, database etc. concepts is critical
Key Responsibilities:
- Translate designs and wireframes into high-quality code.
- Design, build and maintain high performance, reusable, and reliable code.
- Ensure the best possible performance, quality, and responsiveness of the application.
- Use of complex algorithm to build the business requirements.
- Help maintain code quality, organization, and automatization.
- Ability to understand business requirements and translate them into technical requirements
Desired Skills And Requirements
Must have
- Strong JavaScript
- HTML, CSS
- React/Angular/JQuery/Vue
- Problem-solving skills, and Data Structures.
- Strong knowledge of SQL database or RDBMS.
Good to have
- Familiarity with RESTful APIs to connect applications to back-end services.
- Strong knowledge of Web UI design principles, patterns, and best practices.
- Experience with offline storage and performance tuning.
- Experience and understanding of database concepts and sql queries.
- Familiarity with cloud message APIs and push notifications.
- Familiarity with continuous integration.
- A knack for benchmarking and optimization.
Experience:
- Minimum 4 years of experience in JavaScript
- Minimum 3 years of experience in HTML, CSS
- Minimum 3 years of experience in SQL/ RDBMS.
Education:
- A full-time B.E/ B.Tech Degree from a recognized university.
.Net Core is preferred along with SQL or very strong .Net developer with SQL
The job requires the resource to do documentation, SQL support, development with .Net,integration,interaction with end user- mix of all activities
at Elocity Technologies India Private Limited
Elocity is a cleantech start-up striving to make the world a better place through technology innovations. We are building a global infrastructure for making the transition to electric vehicles viable, affordable, and sustainable by working closely with the utilities, governments, and public.
Headquartered out of Canada, we are a team of highly specialized domain experts and problem solvers enabling utilities, public and private sector entities to successfully manage the demands of electric vehicle charging and its infrastructure needs to pave the way for electromobility in future.
To know more visit https://elocitytech.com/
Responsibilities:
- Determines technical feasibility of features or solutions by evaluating problem, customer requirements, possible solutions and technology requirements.
- Exercises judgement in prioritizing tasks and selecting methods and techniques for obtaining solutions.
- Create low-level design of modules of a software application through proper documentation and
- diagrams.
- Develops software solutions by studying requirements, clarifying customer/user needs, analysing data
- and processes and following established software development practices and processes.
- Develops proof of concepts for technical evaluation and early customer feedback
- Updates and shares knowledge by studying state-of-the-art development tools, programming
- techniques, and computing technology; reading professional publications
- Networks with internal and external personnel in own area of expertise.
- Skills:
- Good command in JavaScript/TypeScript. Knowledge of Java/Python will be a plus.
- Experience in Debugging/troubleshooting TypeScript code.
- Experience in API development (REST/GraphQL etc)
- Experience in development of Web and Mobile(android/iOS) applications
- Exposure of Parallel and Asynchronous programming
- Experience in writing Unit tests (Jest or any similar framework)
- Should be proficient in relational Database concepts (Postgres etc.)
- Knowledge of Non-relational Databases would be a plus.
- Good Understanding of Object-Oriented Programming Concepts.
- Good Understanding of Design Patterns.
- Good command of Data structures, Algorithms and Complexity.
- Good at problem solving and analytical skills.
- Experience with Source Code Versioning systems (Git etc)
- Understanding of Micro services Architecture would be a plus