

About Hvantage Technologies Inc.
A global technology and outsourcing company with domain expertise in Banking, Consumer Products, High Tech, Insurance, Financial Services, and Retail, Hvantage Technologies Inc. brings world-class software solutions and support services tailored to the unique requirements of clients across the world. Our IT services and products are built to serve global IT domain requirements.
Hvantage was established in 2011 to provide technology and operations services. Hvantage has more than 200 passionate technologists, operations associates, and leaders providing web, mobile, enterprise, and data solutions, as well as operations support, to our customers.
Hvantage currently operates from Los Angeles, USA, with an offshore development centre in Indore, India. We are part of the reputed DCNPL Group, which has business interests in the high-technology industry.
Hvantage Technologies Inc. delivers exceptional software solutions and support services to its clients using its proven onsite & offshore engagement model. Our team brings decades of cumulative experience in the outsourcing and offshoring space.
Hvantage's culture is built around quality, customer focus, and teamwork, and promotes opportunity for all employees by embracing value creation.

Job Title: Software Development Intern
Company: KGIST Microcollege (MGC)
Location: [Insert Location]
Job Type: Internship
Job Description:
- Assist in developing software applications using various programming languages
- Collaborate with the development team to design, develop, and test software solutions
- Develop knowledge and skills in software development practices and procedures
Responsibilities:
- Develop software applications using JavaScript, Java, HTML, Python, and C++
- Assist in debugging and troubleshooting code
- Participate in code reviews and contribute to improving code quality
- Collaborate with the team to meet project deadlines and goals
Requirements:
- Currently pursuing a degree in Computer Science or a related field
- Strong knowledge of programming languages, including:
  - JavaScript
  - Java
  - HTML
  - Python
  - C++
- Female candidates only
- Strong problem-solving skills and attention to detail
- Ability to work in a team environment
What We Offer:
- Opportunity to gain hands-on experience in software development
- Collaborative and dynamic work environment
- Flexible work hours and remote work options (if applicable)
- Chance to develop skills and knowledge in software development practices


Requirements:
- Practical experience with AngularJS, Angular, React, etc.
- Practical experience with server-side technologies such as Django or Ruby on Rails is a plus.
You must be:
- A quick, analytical thinker
- Someone who wants to learn how to write quality code
What you will be doing:
- A lot of heavy JS changes to improve the user experience.
- Help build out the front end completely once all the information from the backend has been dumped to a template (see the sketch below).
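As a rough illustration of the backend-to-template handoff described above, here is a minimal Django sketch; the view name, template name, and data are invented placeholders, not part of this posting's actual stack:

```python
# views.py -- hypothetical Django view that passes ("dumps") backend data into a template context
from django.shortcuts import render

def dashboard(request):
    # In a real project this data would come from the ORM or an upstream service.
    items = [
        {"id": 1, "name": "Widget", "price": 9.99},
        {"id": 2, "name": "Gadget", "price": 24.50},
    ]
    return render(request, "dashboard.html", {"items": items})
```

On the template side, Django's built-in json_script filter is one common way to expose such a context as JSON so the heavy JavaScript work can take over in the browser.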
You will be responsible for designing, building, and maintaining data pipelines that handle real-world data (RWD) at Compile. You will handle both inbound and outbound data deliveries at Compile for datasets including claims, remittances, EHR, SDOH, etc.
You will
- Build and maintain data pipelines, specifically for RWD.
- Build, enhance, and maintain existing pipelines in PySpark and Python, and help build analytical insights and datasets (a minimal PySpark sketch appears after the lists below).
- Schedule and maintain pipeline jobs for RWD.
- Develop, test, and implement data solutions based on the design.
- Design and implement quality checks on existing and new data pipelines.
- Ensure adherence to the security and compliance requirements of the products.
- Maintain relationships with various data vendors and track changes and issues across vendors and deliveries.
You have
- Hands-on experience with ETL processes (minimum of 5 years).
- Excellent communication skills and the ability to work with multiple vendors.
- High proficiency with Spark and SQL.
- Proficiency in data modeling, validation, quality checks, and data engineering concepts.
- Experience with big-data processing technologies such as Databricks, dbt, S3, Delta Lake, Deequ, Griffin, Snowflake, and BigQuery.
- Familiarity with version control technologies and CI/CD systems.
- Understanding of scheduling tools like Airflow/Prefect.
- Minimum of 3 years of experience managing data warehouses.
- Familiarity with healthcare datasets is a plus.
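As a rough illustration of the pipeline and quality-check work described above, here is a minimal PySpark sketch; the paths, column names, and thresholds are invented placeholders rather than Compile's actual schema, and production checks would more likely use tools such as Deequ or dbt tests:

```python
# Minimal sketch: load a claims delivery, run basic quality checks, write a curated output.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-quality-check").getOrCreate()

# Hypothetical inbound delivery location.
claims = spark.read.parquet("s3://example-bucket/inbound/claims/2024-06/")

total = claims.count()
null_ids = claims.filter(F.col("claim_id").isNull()).count()
dupes = total - claims.dropDuplicates(["claim_id"]).count()

# Illustrative quality gates; thresholds are placeholders.
if total == 0 or null_ids / total > 0.01 or dupes / total > 0.001:
    raise ValueError(f"Quality check failed: total={total}, null_ids={null_ids}, dupes={dupes}")

(claims.dropDuplicates(["claim_id"])
       .write.mode("overwrite")
       .parquet("s3://example-bucket/curated/claims/2024-06/"))
```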
Compile embraces diversity and equal opportunity in a serious way. We are committed to building a team of people from many backgrounds, perspectives, and skills. We know the more inclusive we are, the better our work will be.


CoreStack, an AI-powered multi-cloud governance solution, empowers enterprises to rapidly achieve Continuous and Autonomous Cloud Governance at Scale. CoreStack enables enterprises to realize outcomes such as a 40% decrease in cloud costs and a 50% increase in operational efficiency by governing operations, security, cost, access, and resources. CoreStack also assures 100% compliance with standards such as ISO, FedRAMP, NIST, HIPAA, PCI-DSS, AWS CIS, and the AWS Well-Architected Framework (WAF). We work with many large global customers across multiple industries, including Financial Services, Healthcare, Retail, Education, Telecommunications, Technology, and Government.
CoreStack recently closed an $8.5 million Series A financing. CoreStack was recognized as an IDC Innovator in Cloud Management Solutions and was included in the 2020 Gartner Magic Quadrant for Cloud Management Platforms. Earlier, in 2019, Gartner named CoreStack a Cool Vendor in Cloud Computing. CoreStack is a Microsoft Azure Gold & Co-Sell Partner and an Amazon AWS Advanced Technology Partner.
• Lead a team of open source developers in building Cloud Products/Solutions
• Participate in R&D activities led by architects
• Contribute to product engineering and system integration
• Perform system integration for various private cloud platforms
• Communicate effectively with the various stakeholders involved in the delivery process and get things done
Skills Required
• Experience integrating with or working on the OpenStack cloud platform is a must (a brief sketch using the OpenStack SDK follows this list)
• Must be well versed with the OpenStack ecosystem
• An inquisitiveness to understand the intricacies at the code level while also grasping the constituents of the bigger picture
• Hands-on hacker mentality with excellent coding skills and a passion to develop high-quality code
• Understand design patterns and their significance in software development
• Ability to efficiently context-switch between different problems and to prioritize them properly before tackling them
• Must be able to inspire juniors and command the respect of peers with technical acumen and attitude
• Contribution to any open-source projects, blogging on technical topics, creation of any tool/framework/library, taking part in hackathons/coding challenges/workshops, being part of any technical group/forum will all be valued
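Purely as an illustration of the OpenStack integration work mentioned above, here is a minimal sketch using the openstacksdk Python library; the cloud name, image, flavor, and network are placeholders and would come from clouds.yaml and the target environment in practice:

```python
# Minimal sketch: connect to an OpenStack cloud and boot a server with openstacksdk.
import openstack

conn = openstack.connect(cloud="example-cloud")  # placeholder cloud entry from clouds.yaml

image = conn.image.find_image("ubuntu-22.04")    # placeholder image name
flavor = conn.compute.find_flavor("m1.small")    # placeholder flavor name
network = conn.network.find_network("private")   # placeholder network name

server = conn.compute.create_server(
    name="demo-instance",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```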
CoreStack Offers
- Competitive salary
- Competitive benefit package with appreciable equity
- Exciting, fast-paced and entrepreneurial culture
- Health insurance and other company benefits
DYT (Do Your Thng) is an app where social media users can share brands they love with their followers and earn money while doing so! We believe everyone is an influencer. Our aim is to democratise social media and allow people to be rewarded for the content they post. How does DYT help you? It accelerates your career through collaboration opportunities with top brands and gives you access to a community full of experts in the influencer space.
Minimum of 3-4 years of experience in a sales organisation
Minimum of 4 to 14 years of OTM technical/functional experience
- Must have functional and techno-functional experience in OTM implementation and/or have supported projects dealing with these systems, and/or have worked on OTM Cloud
- Must have strong technical and functional knowledge of the latest OTM Application
- Must have knowledge of preparing mapping documents to interface the OTM system with EDI, WMS, order management, and finance systems, and be able to translate functional specifications into design specifications for the Offshore Delivery Centre
- Must be able to write SQL statements for automation agents and other technical tasks
- Hands-on experience with JSPX/XSL will be an additional advantage.
- Experience in end-to-end OTM life cycles/implementations/upgrades and OTM architecture will be preferred
- Bachelor's or master's degree in Computer Engineering, Computer Science, Computer Applications, Mathematics, Statistics, or a related technical field; at least 3 years of relevant experience in lieu of the above if from a different stream of education.
- Well-versed in, with 3+ years of hands-on, demonstrable experience in:
  ▪ Stream & batch big data pipeline processing using Apache Spark and/or Apache Flink
  ▪ Distributed cloud-native computing, including serverless functions
  ▪ Relational, object store, document, graph, etc. database design & implementation
  ▪ Microservices architecture, API modeling, design, & programming
- 3+ years of hands-on development experience in Apache Spark using Scala and/or Java.
- Ability to write executable code for services using Spark RDD, Spark SQL, Structured Streaming, Spark MLlib, etc., with a deep technical understanding of the Spark processing framework (a minimal streaming sketch follows this list).
- In-depth knowledge of standard programming languages such as Scala and/or Java.
- 3+ years of hands-on development experience in one or more libraries & frameworks such as Apache Kafka, Akka, Apache Storm, Apache NiFi, ZooKeeper, the Hadoop ecosystem (i.e., HDFS, YARN, MapReduce, Oozie & Hive), etc.; extra points if you can demonstrate your knowledge with working examples.
- 3+ years of hands-on development experience in one or more relational and NoSQL datastores such as PostgreSQL, Cassandra, HBase, MongoDB, DynamoDB, Elasticsearch, Neo4j, etc.
- Practical knowledge of distributed systems concepts such as partitioning, bucketing, the CAP theorem, replication, and horizontal scaling.
- Passion for distilling large volumes of data and analyzing performance, scalability, and capacity issues in big data platforms.
- Ability to clearly distinguish system and Spark job performance, and to perform Spark performance tuning and resource optimization.
- Perform benchmarking/stress tests and document best practices for different applications.
- Proactively work with tenants on improving overall performance and ensuring the system is resilient and scalable.
- Good understanding of virtualization & containerization; must demonstrate experience with technologies such as Kubernetes, Istio, Docker, OpenShift, Anthos, Oracle VirtualBox, Vagrant, etc.
- Well-versed, with demonstrable working experience, in API management, API gateways, service mesh, identity & access management, and data protection & encryption.
- Hands-on, demonstrable working experience with DevOps tools and platforms, viz. Jira, Git, Jenkins, code quality & security plugins, Maven, Artifactory, Terraform, Ansible/Chef/Puppet, Spinnaker, etc.
- Well-versed in AWS and/or Azure and/or Google Cloud; must demonstrate experience in at least FIVE (5) services offered under AWS and/or Azure and/or Google Cloud across categories such as Compute, Storage, Database, Networking & Content Delivery, Management & Governance, Analytics, and Security, Identity & Compliance (or equivalent demonstrable cloud platform experience).
- Good understanding of storage, networking, and storage networking basics, which will enable you to work in a cloud environment.
- Good understanding of network, data, and application security basics, which will enable you to work in a cloud as well as a business applications/API services environment.
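Purely as an illustration of the Structured Streaming experience listed above, here is a minimal sketch; it uses PySpark for brevity even though the posting asks for Scala and/or Java, and the Kafka broker, topic, and event schema are invented placeholders (running it also requires the spark-sql-kafka connector package):

```python
# Minimal Structured Streaming sketch: read JSON events from Kafka, count per window, print to console.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-stream-demo").getOrCreate()

schema = StructType([
    StructField("event_type", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # placeholder broker
       .option("subscribe", "events")                         # placeholder topic
       .load())

events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(F.from_json("json", schema).alias("e"))
          .select("e.*"))

counts = (events.withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "5 minutes"), "event_type")
          .count())

query = counts.writeStream.outputMode("update").format("console").start()
query.awaitTermination()
```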

We are looking for two Sr FullStack (JS) Engineers with a keen eye for great design & UX. You will be responsible for the development of new software products (internal) and solving complex technical challenges for scale-ups and enterprise companies. You should excel in working with large-scale applications and frameworks and have outstanding communication and leadership skills.
Responsibilities
- Write clean, high-quality, high-performance and maintainable code
- Solve complex technical problems
- Perform an objective analysis of the problem statement and come up with an unbiased technical solution before writing a single line of code
- Coordinate cross-functionally to ensure the project meets business objectives and compliance standards
- Participate in and drive code reviews
Requirements
- Excellent attention to detail.
- Outstanding written and verbal communication skills.
- Demonstrated expertise in building production-grade, high-performance applications using ES2019/ES2020
- Expert at converting Figma or Sketch prototypes into pixel-perfect screens
- Highly proficient in React, Redux and TypeScript / ReasonML.
- Must be a self-starter who can work well with minimal-to-no guidance in a fluid environment
- Must be excited by challenges surrounding the development of highly scalable & distributed systems.
- Agility and the ability to adapt quickly to changing requirements, scope, and priorities
- Experience working on massively large-scale data systems in production environments
Preferred Requirements
- Production-level experience of designing a product for multiple international markets and languages (i18n, l10n).
- Bonus points for prior experience in Go & Python.
- Bonus points for open-source contributions, side projects, blog posts, and YouTube tech videos.
Qualifications
- BS in Computer Science (or related field)
- 4-5 years of relevant work experience



