
Masters India is a licensed GST Suvidha Provider (GSP) appointed by the Goods and Services Tax Network (GSTN), a Government of India enterprise, and currently ranks among the top five GST Suvidha Providers appointed by the GST Council of India. The company's aim is to build intuitive software solutions for the complex problems faced by businesses across the globe, a mission we fulfil by offering tax and financial automation products to enterprises.
Masters India – A Family Business Spanning Decades. We are part of a business group established by Shri Chandra Prakash Agrawal in 1978. Mr Agrawal established the second copper melting plant in Delhi, and we have grown continuously since then. Masters India was established in 1999 and presently manufactures aluminium wire rod, sheet, and foil. The family has manufacturing facilities in Delhi, Rajasthan, and Maharashtra, and we also operate in the healthcare, hospitality, and IT sectors.
Key Responsibilities:
- Onboard enterprise customers for solutions delivered by Masters India.
- Act as project manager for integration projects undertaken for enterprise customers.
- Maintain the highest standards of customer support.
- Follow up for resolution with DEV/QA team on the issues raised by the customer.
- Ensure a continuous customer-success approach with zero attrition.
- Filing GST Returns for enterprise customers
- Providing Reconciliation Statement to the clients.
- Resolving queries related to e-Invoices and E-Way Bills (EWB) for clients.
Required Skills:
- Highly developed, all-round interpersonal skills.
- Impeccable communication, presentation, networking, and negotiation skills.
- Customer-service oriented, with a positive, well-motivated attitude.
- Conscientious, hard-working, and driven to improve sales and activity performance beyond targets.

About Masters India Private Limited
Masters India is a GST Suvidha Provider (GSP) appointed by the Goods and Services Tax Network (GSTN), a Government of India enterprise. Our mission is to build intuitive software solutions that address complex challenges faced by businesses across the globe. We enable enterprises to automate their tax and financial compliance processes with powerful, easy-to-use products.
While most enterprises rely on ERPs to streamline their operations, compliance is often still managed manually through spreadsheets and government portals—creating inefficiencies and errors. Masters India addresses this gap by providing advanced compliance automation solutions that reduce manual work, enhance accuracy, and improve overall ease of doing business. To date, we have successfully processed over 150 million e-Invoices for 1,500+ enterprise customers.
Role Overview:
The Partnership Associate will focus on nurturing existing partner relationships and identifying and onboarding new partners to strengthen Marmeto’s ecosystem. This role is critical for expanding strategic alliances and driving mutual growth.
Key Responsibilities:
- Manage and nurture relationships with existing partners, ensuring alignment and mutual growth.
- Identify and onboard new partners to expand Marmeto’s network.
- Develop and execute partnership strategies to drive referrals and business opportunities.
- Collaborate with partners to co-create and promote joint offerings.
- Track and report on partnership performance metrics.
- Organize and participate in partner-focused events, webinars, and campaigns.
Requirements
Key Skills and Qualifications:
- Experience in partnership or business development roles.
- Strong relationship-building and negotiation skills.
- Understanding of e-commerce platforms and ecosystems.
- Excellent written and verbal communication skills.
- Ability to work independently and manage multiple relationships.
Publicis Sapient Overview:
As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions, and you will independently drive design discussions to ensure the overall health of the solution.
Job Summary:
As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions, applying a deep understanding of data integration and big data design principles to create custom solutions or implement packaged solutions. You will independently drive design discussions to ensure the overall health of the solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is preferable.
Role & Responsibilities:
Job Title: Senior Associate L1 – Data Engineering
Your role is focused on the design, development, and delivery of solutions involving:
• Data Ingestion, Integration and Transformation
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time
• Build functionality for data analytics, search and aggregation
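The ingestion responsibility above — pulling data from multiple heterogeneous sources into one pipeline — can be sketched in miniature. This is a hedged illustration in plain Python (the field names and the two inline sources are invented for the example); a production pipeline would use tools such as Spark or NiFi from the stack this role describes.

```python
import csv
import io
import json

def normalize(record):
    # Map source-specific fields onto one hypothetical common schema.
    return {"id": str(record["id"]), "amount": float(record["amount"])}

def ingest_csv(text):
    # Batch source 1: CSV rows become dicts via DictReader.
    return [normalize(row) for row in csv.DictReader(io.StringIO(text))]

def ingest_json(text):
    # Batch source 2: a JSON array of records.
    return [normalize(rec) for rec in json.loads(text)]

csv_src = "id,amount\n1,10.5\n2,3.0"
json_src = '[{"id": 3, "amount": 7.25}]'

# Two heterogeneous sources merged into one normalized record stream.
records = ingest_csv(csv_src) + ingest_json(json_src)
print(records)
```

The point of the sketch is the shape, not the scale: each source gets its own reader, and all sources converge on a single normalization step.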
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 3.5+ years of IT experience, with 1.5+ years in data-related technologies.
2. Minimum 1.5 years of experience in big data technologies.
3. Hands-on experience with the Hadoop stack (HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow) and the other components required to build end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.
4. Strong experience in at least one of Java, Scala, or Python; Java preferred.
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery.
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience.
2. Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation.
3. Knowledge of distributed messaging frameworks such as ActiveMQ, RabbitMQ, or Solace; search and indexing; and microservices architectures.
4. Performance tuning and optimization of data pipelines.
5. CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, and code quality.
6. Working knowledge of data-platform services on at least one cloud platform, IAM, and data security.
7. Cloud data specialty and other related big data technology certifications.
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes


Skills Required: Angular 2+, AngularJS, JavaScript, HTML/CSS, jQuery, AJAX, Bootstrap
Additional skills (good to have): Java Spring Boot, REST API development, SQL/NoSQL database knowledge
A tremendous opportunity to make an impact on the business and the ad-tech industry.
Benefits:
Build shit that matters!!!
Experience the impact of your hard work
Work hard and party harder
Work with an extremely committed group of people
Explore and implement new technologies

- Meeting with the design team to review website and application requirements.
- Setting tasks and development goals.
- Configuring the company SharePoint systems to specified requirements.
- Developing new web components using XML, .NET, SQL, and C#.
- Designing, coding, and implementing scalable applications.
- Extending SharePoint functionality with forms, web parts, and application technologies.


About the role
Checking quality is one of the most important tasks at Anakin. Our clients price their products based on our data, and minor errors on our end can cost a client millions of dollars. You would work with multiple tools and with people across various departments to ensure the accuracy of the data being crawled, and you would set up manual and automated processes and make sure they run, to ensure the highest possible data quality.
You are the engineer other engineers can count on. You embrace every problem with enthusiasm. You remove hurdles, and you are a self-starter and a team player. You have the hunger to venture into unknown areas and make the system work.
Your Responsibilities would be to:
- Understand customer web scraping and data requirements; translate these into test approaches that include exploratory manual/visual testing and any additional automated tests deemed appropriate
- Take ownership of the end-to-end QA process in newly-started projects
- Draw conclusions about data quality by producing basic descriptive statistics, summaries, and visualisations
- Proactively suggest and take ownership of improvements to QA processes and methodologies by employing other technologies and tools, including but not limited to: browser add-ons, Excel add-ons, UI-based test automation tools etc.
- Ensure that project requirements are testable; work with project managers and/or clients to clarify ambiguities before QA begins
- Drive innovation and advanced validation and analytics techniques to ensure data quality for Anakin's customers
- Optimize data quality codebases and systems to monitor the Anakin family of app crawlers
- Configure and optimize the automated and manual testing and deployment systems used to check the quality of billions of data points from 1,000+ crawlers across the company
- Analyze data and bugs that require in-depth investigations
- Interface directly with external customers including managing relationships and steering requirements
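One of the responsibilities above is drawing conclusions about data quality from basic descriptive statistics. As a hedged sketch (the batch values and the 50% threshold are made up for the example; real checks would be tuned per dataset), a robust median-based check on a batch of scraped prices might look like:

```python
import statistics

def quality_report(prices):
    """Descriptive statistics plus simple sanity checks for a batch
    of scraped price values (None marks a failed extraction)."""
    valid = [p for p in prices if p is not None and p > 0]
    median = statistics.median(valid)
    return {
        "total": len(prices),
        "missing_or_invalid": len(prices) - len(valid),
        "median": median,
        # Flag prices further than 50% from the median as suspect:
        # a robust check, since one bad value cannot inflate the median.
        "suspect": [p for p in valid if abs(p - median) > 0.5 * median],
    }

batch = [19.99, 21.5, None, 20.25, 18.75, 2099.0, 19.5]
report = quality_report(batch)
print(report)
```

The median is used deliberately instead of the mean: a single mis-scraped price (2099.0 here) would drag the mean toward itself and hide the very error the check is meant to catch.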
Basic Qualifications:
- 2+ years of experience as a backend or a full-stack software engineer
- Web scraping experience with Python or Node.js
- 2+ years of experience with AWS services such as EC2, S3, Lambda, etc.
- Should have managed a team of software engineers
- Should be paranoid about data quality
Preferred Skills and Experience:
- Deep experience with network debugging across all OSI layers (Wireshark)
- Knowledge of networks or/and cybersecurity
- Broad understanding of the landscape of software engineering design patterns and principles
- Ability to work quickly and accurately in a high-stress environment, removing bugs at run time within minutes
- Excellent communicator, both written and verbal
Additional Requirements:
- Must be available to work extended hours and weekends when needed to meet critical deadlines
- Must have an aversion to politics and BS; your work should speak for itself.
- Must be comfortable with uncertainty. In almost all the cases, your job will be to figure it out.
- Must not be bounded to comfort zone. Often, you will need to challenge yourself to go above and beyond.
Role Purpose:
As a DevOps engineer, you should be strong in both the Dev and the Ops part of DevOps. We are looking for someone who has a deep understanding of systems architecture, understands core CS concepts well, and can reason about system behaviour rather than merely working with the toolset of the day. We believe that only such a person will be able to set a compelling direction for the team and excite those around them.
If you are someone who fits the description above, you will find that the rewards are well worth the high bar. Being one of the early hires of the Bangalore office, you will have a significant impact on the culture and the team; you will work with a set of energetic and hungry peers who will challenge you, and you will have considerable international exposure and opportunity for impact across departments.
Responsibilities
- Deployment, management, and administration of web services in a public cloud environment
- Design and develop solutions for deploying highly secure, highly available, performant and scalable services in elastically provisioned environments
- Design and develop continuous integration and continuous deployment solutions from development through production
- Own all operational aspects of running web services including automation, monitoring and alerting, reliability and performance
- Have direct impact on running a business by thinking about innovative solutions to operational problems
- Drive solutions and communication for production impacting incidents
- Running technical projects and being responsible for project-level deliveries
- Partner well with engineering and business teams across continents
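The monitoring-and-reliability responsibilities above often come down to small, well-tested primitives. As one hedged example (the function names and timings are illustrative, not from this posting), a service health probe with exponential backoff in Python:

```python
import time

def check_service(probe, retries=3, base_delay=0.01):
    """Probe a service with exponential backoff; `probe` is any
    callable returning True when the service is healthy."""
    for attempt in range(retries):
        if probe():
            return True
        # Back off: base_delay, 2x, 4x, ... before the next attempt.
        time.sleep(base_delay * (2 ** attempt))
    return False

# Simulated flaky service that only succeeds on its third probe.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    return calls["n"] >= 3

ok = check_service(flaky)
print(ok, calls["n"])
```

In a real monitoring stack the `probe` would hit a health endpoint and a failure after all retries would page an on-call engineer; the backoff keeps the checker itself from hammering a struggling service.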
Required Qualifications
- Bachelor’s or advanced degree in Computer Science or closely related field
- 4–6 years of professional experience in DevOps, with at least 1–2 years in Linux/Unix
- Very strong in core CS concepts around operating systems, networks, and systems architecture including web services
- Strong scripting experience in Python and Bash
- Deep experience administering, running and deploying AWS based services
- Solid experience with Terraform, Packer and Docker or their equivalents
- Knowledge of security protocols and certificate infrastructure.
- Strong debugging, troubleshooting, and problem solving skills
- Broad experience with cloud hosted applications including virtualization platforms, relational and non relational data stores, reverse proxies, and orchestration platforms
- Curiosity, continuous learning and drive to continually raise the bar
- Strong partnering and communication skills
Preferred Qualifications
- Past experience as a senior developer or application architect strongly preferred.
- Experience building continuous integration and continuous deployment pipelines
- Experience with Zookeeper, Consul, HAProxy, ELK-Stack, Kafka, PostgreSQL.
- Experience working with, and preferably designing, systems compliant with a security framework (PCI DSS, ISO 27000, HIPAA, SOC 2, ...)
- Experience with AWS orchestration services such as ECS and EKS.
- Experience working with AWS ML pipeline services such as Amazon SageMaker

- 4+ years of experience, with a solid understanding of Python and Java and general software development skills (source code management, debugging, testing, deployment, etc.).
- Experience working with Solr and Elasticsearch.
- Experience with NLP technologies and the handling of unstructured text; detailed understanding of text pre-processing and normalisation techniques such as tokenisation, lemmatisation, stemming, and POS tagging.
- Prior experience implementing traditional ML solutions for classification, regression, or clustering problems.
- Expertise in text analytics (sentiment analysis, entity extraction, language modelling) and the associated sequence-learning models (RNN, LSTM, GRU).
- Comfortable working with deep-learning libraries (e.g. PyTorch).
- Candidates with as little as 1–2 years of experience will also be considered; IIT, IIIT, BITS Pilani, and other top colleges and universities are preferred.
- A Master's degree in machine learning is a plus.
- Candidates may also be sourced from Mu Sigma and Manthan.
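The pre-processing techniques named above (tokenisation, stemming, normalisation) can be shown with a toy example. This is a deliberately naive sketch in plain Python — the suffix list is invented for illustration and is not the Porter algorithm; real pipelines would use NLTK, spaCy, or a similar library:

```python
import re

def tokenize(text):
    # Normalise: lowercase, then split on non-alphanumeric characters.
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

def naive_stem(token):
    # Toy suffix-stripping stemmer for illustration only; it keeps a
    # minimum stem length of 3 so short words pass through unchanged.
    for suffix in ("ing", "edly", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) - len(suffix) >= 3:
            return token[: -len(suffix)]
    return token

tokens = tokenize("The crawlers extracted 42 matching records.")
stems = [naive_stem(t) for t in tokens]
print(stems)
```

Even this crude version shows why stemming matters for search and analytics: "crawlers", "extracted", and "matching" all collapse toward base forms, so downstream counts and indexes treat inflected variants as one term.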


Driving design and innovation in the user-facing application used to manage Yulu's fleet, you will work on the Yulu mobile application, which includes maps, interaction with IoT devices via Bluetooth, and various other features. You will use your expertise in application development to evaluate and select development methods, processes, standard methodologies, and tools. An eye for detail, pixel perfection, and a willingness to walk the extra mile to deliver a great user experience are essential.
Key Responsibilities
● Designing and building mobile applications for Apple’s iOS platform.
● Collaborating with the design team to define app features.
● Develop test specs and approaches for the application
● Investigate and resolve performance issues, and inefficiencies
● Ensuring quality and performance of the application to specifications.
● Identifying potential problems and resolving application bottlenecks.
● Fixing application bugs before the final release.
● Understand the market and participate in product roadmap discussions
Key Requirements
● Degree from a top engineering college, or equivalent technical background is preferred
● Agility and ability to adapt quickly to changing requirements, scope and priorities
● 2-4 years of industry experience in iOS Mobile Application design and development, with minimum 2 apps deployed in App Store
● A deep familiarity with Swift, and experience working with iOS frameworks such as Maps, Core Location, Core Bluetooth, and Core Animation
● Strong UX/UI design exposure and experience in making apps work intuitively
● Ability to identify issues and improve application performance
● Experience in the usage of instruments to detect memory leaks for performance optimization
● Develop unit and functional test cases
● Familiar with the following – Git repository, Restful API, MVC, MVP, MVVM
● Strong CS fundamentals (with competencies in algorithms and data structures)
● Experience with third-party libraries and APIs
● Solid understanding of the full mobile development life cycle
● Highly accountable and takes ownership, with a collaborative attitude, and a lifelong learner
To sell products.

