
Position: QA Automation Engineer
Location: Mumbai, India
Experience: 2+ Years
Type: Full-Time
Company: Deqode
Overview:
Deqode is seeking a skilled QA Automation Engineer with a passion for quality, automation, and robust testing practices. This role is ideal for professionals from product-based software development companies who have worked on e-commerce platforms.
Required Skills:
- Proficiency in Selenium (Grid, parallel execution), TestNG.
- Experience with API testing (Postman, RestAssured).
- Strong SQL and backend data validation.
- Experience using Git, Jira, Asana.
- Familiarity with Jenkins, Confluence.
- Understanding of cross-browser testing.
- Exposure to Mocha/Chai for frontend/backend automation is a plus.
- Eye for design and UX consistency.
- Strong written and verbal communication.
Preferred Background:
- Must be from a product-based software development company.
- Prior experience in e-commerce projects is a major plus.
- Ability to work on time-critical and fast-paced projects.
Key Responsibilities:
- Design and maintain automated test frameworks for web and API applications.
- Perform manual and automated tests to ensure product quality.
- Build and execute test cases using Selenium with TestNG.
- Conduct comprehensive REST API testing.
- Write and optimize complex SQL queries for test data validation.
- Use Jira/Asana for issue tracking and documentation.
- Collaborate using Confluence for test documentation.
- Execute tests across multiple browsers and operating systems.
- Participate in Agile processes like sprint planning and retrospectives.
- Identify and troubleshoot issues during testing.
- Maintain CI pipelines using Jenkins.
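The "SQL and backend data validation" skill above can be sketched in a few lines. This is a minimal illustration, not part of the posting: it uses Python's built-in sqlite3 as a stand-in for the product database, and the orders/order_items schema and validate_order_totals name are invented for the example.

```python
import sqlite3

def validate_order_totals(conn):
    """Return ids of orders whose stored total disagrees with the sum of their line items."""
    cur = conn.execute("""
        SELECT o.id, o.total, COALESCE(SUM(i.price * i.qty), 0) AS computed
        FROM orders o LEFT JOIN order_items i ON i.order_id = o.id
        GROUP BY o.id, o.total
    """)
    return [row[0] for row in cur if abs(row[1] - row[2]) > 1e-9]

# Throwaway in-memory database with one consistent and one inconsistent order.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);
    CREATE TABLE order_items (order_id INTEGER, price REAL, qty INTEGER);
    INSERT INTO orders VALUES (1, 30.0), (2, 99.0);
    INSERT INTO order_items VALUES (1, 10.0, 3), (2, 10.0, 3);
""")
mismatched = validate_order_totals(conn)
print(mismatched)  # [2]: order 2's stored total disagrees with its line items
```

In a real suite the same query would run against a staging database after an automated checkout flow, with the mismatch list asserted empty.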


Job Title : Python Developer – API Integration & AWS Deployment
Experience : 5+ Years
Location : Bangalore
Work Mode : Onsite
Job Overview :
We are seeking an experienced Python Developer with strong expertise in API development and AWS cloud deployment.
The ideal candidate will be responsible for building scalable RESTful APIs, automating power system simulations using PSS®E (psspy), and deploying automation workflows securely and efficiently on AWS.
Mandatory Skills : Python, FastAPI/Flask, PSS®E (psspy), RESTful API Development, AWS (EC2, Lambda, S3, EFS, API Gateway), AWS IAM, CloudWatch.
Key Responsibilities :
Python Development & API Integration :
- Design, build, and maintain RESTful APIs using FastAPI or Flask to interface with PSS®E.
- Automate simulations and workflows using the PSS®E Python API (psspy).
- Implement robust bulk case processing, result extraction, and automated reporting systems.
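The bulk case processing described above might be shaped roughly like this. This is a hypothetical sketch, not the actual implementation: psspy is Siemens' proprietary PSS®E API and is not available here, so a stub solver stands in for it, and run_case/process_cases are invented names.

```python
# psspy (the PSS(R)E Python API) only exists inside a PSS(R)E installation,
# so `solve` is injected; in production it would wrap psspy case-load and
# power-flow calls. The bulk loop and pass/fail reporting are the point.

def run_case(case_path, solve):
    """Run one study case and return a small result record for reporting."""
    converged, max_mismatch = solve(case_path)
    return {"case": case_path, "converged": converged, "mismatch_mva": max_mismatch}

def process_cases(case_paths, solve):
    """Process cases in bulk and split results into passed/failed lists."""
    results = [run_case(p, solve) for p in case_paths]
    passed = [r for r in results if r["converged"]]
    failed = [r for r in results if not r["converged"]]
    return passed, failed

# Stub solver standing in for the real power-flow run: the "bad" case diverges.
def fake_solve(path):
    return ("bad" not in path, 0.5 if "bad" not in path else 120.0)

passed, failed = process_cases(["case_a.sav", "case_bad.sav"], fake_solve)
print(len(passed), len(failed))  # 1 1
```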
AWS Cloud Deployment :
- Deploy APIs and automation pipelines using AWS services such as EC2, Lambda, S3, EFS, and API Gateway.
- Apply cloud-native best practices to ensure reliability, scalability, and cost efficiency.
- Manage secure access control using AWS IAM, API keys, and implement monitoring using CloudWatch.
Required Skills :
- 5+ Years of professional experience in Python development.
- Hands-on experience with RESTful API development (FastAPI/Flask).
- Solid experience working with PSS®E and its psspy Python API.
- Strong understanding of AWS services, deployment, and best practices.
- Proficiency in automation, scripting, and report generation.
- Knowledge of cloud security and monitoring tools like IAM and CloudWatch.
Good to Have :
- Experience in power system simulation and electrical engineering concepts.
- Familiarity with CI/CD tools for AWS deployments.
Ready to take the next step in your professional journey? Join the 1Point1 family and nurture your future with us. From personal development to career growth, we provide the soil for your success! Let’s grow together and make every day a new opportunity to blossom.



Dynamic technical leader who can lead a cutting-edge product development team tasked with building world-class business applications. This role reports to the CTO and will be responsible for engineering one of the key product lines of SmartDocs.
Key Responsibilities:
- Manage the end-to-end engineering process to ensure product releases ship successfully as per the published product roadmap
- Coordinate requirements prioritization with product managers
- Release products that meet quality standards
- Coordinate code reviews and ensure proper development guidelines are adhered to
- Coordinate QA testing with QA teams
- Ensure proper documentation is released
- Coordinate with DevOps engineers to maintain proper development landscape
- Nurture and expand the dynamic development team that is nimble and can meet the requirements of the product managers
Key experience and required skills:
- At least 2 years' experience managing successful development teams building modern web applications in the role of “Developer Manager”, “Engineering Manager”, or higher
- Experience as a senior Java developer for at least 2 years with full hands-on development experience
- Excellent understanding of modern web application architectures and architecture patterns
- Expertise in Agile and agile-type processes
- Experience being part of SaaS product development is a definite plus
- Significant experience in management reporting and communication to executive leadership teams
Success Factors
The ideal candidate has been part of one or more startups that delivered modern web applications, preferably SaaS products in the enterprise space, and brings a proper mix of management skills and technical prowess.
What we offer
SmartDocs is a modern progressive product company with the mission of delivering an awesome Procure-to-Pay platform to medium and large enterprises to optimize and streamline business processes. We offer:
- Awesome workplace
- Cool benefits
- Cutting-edge tech stack
- Socially Impactful products
- Multiple learning opportunities
What we expect
We are looking for a dynamic leader with the following key traits
- Responsibility
- Accountability
- Flexibility
- Team player
- Awesome communication skills

Egnyte is seeking an experienced Sr. Software Engineer to join our Software Engineering (Infrastructure) group. The Software Engineering (Infrastructure) group builds large distributed components and services that run Egnyte’s Cloud Platform. Our code serves billions of requests per day with sub-second latency in a fault-tolerant environment. Some of the responsibilities for this group include Egnyte’s Cloud File System, Object Store, Metadata Stores, Search Systems, Recommendations Systems, Synchronization, and intelligent caching of multi-petabyte datasets. We are looking for candidates with a shared passion for building large-scale distributed systems and a keen sense for tackling complexities that come with scaling through multiple orders of magnitude.
What You’ll Do (including but not limited to)
- Design and develop highly-scalable elastic cloud architecture that seamlessly integrates with on-premises systems
- Challenge and redefine existing architectural fundamentals in order to provide next level of performance and scalability; ability to foresee post-deployment design challenges, performance and scale bottlenecks
- Work with multicultural, geographically distributed teams and closely coordinate with cross-functional teams in multiple time zones.
- Deliver enterprise-grade products to customers and continuously work with engineering team to refine products in the field
- Mentor interns and junior engineers, collaborate with Operations, and work closely with CTO on roadmap items
- Extensive penetration testing to ensure security across a hybrid deployment between public/private cloud
- Monitor and manage 3,000+ nodes using modern DevOps tools and APM solutions
- Proactive performance and exception analysis
Your Qualifications
- 5+ years of relevant industry work experience
- Demonstrated success designing and developing complex systems
- Expertise with multi-tenant, highly complex, cloud solutions
- Experience owning all aspects of software engineering, from design to implementation, QA and maintenance.
- Experience with the following technologies: Java, SQL, Linux, Python, HBase/BigTable
- Data driven decision process
- Relies on unit testing instead of manual QA
- Knowledge of DevOps techniques
- BS or MS degree in Computer Science or related field
Bonus Skills
- Experience with Hybrid and/or on-premises solutions
- Experience in working with AWS or GCP
- Experience with the following technologies: Nginx, HAProxy, BigQuery, New Relic, Graphite, and/or Puppet
- Security / Governance expertise
About Egnyte
In a content-critical age, Egnyte fuels business growth by enabling content-rich business processes, while also providing organizations with visibility and control over their content assets. Egnyte’s cloud-native content services platform leverages the industry’s leading content intelligence engine to deliver a simple, secure, and vendor-neutral foundation for managing enterprise content across business applications and storage repositories. More than 16,000 customers trust Egnyte to enhance employee productivity, automate data management, and reduce file-sharing cost and complexity. Investors include Google Ventures, Kleiner Perkins Caufield & Byers, and Goldman Sachs. For more information, visit http://www.egnyte.com/

Responsibilities
· Resolving technical support inquiries directly or indirectly through on-site and/or remote first-level support representatives.
· Performing routine assignments at the entry level of a professional job progression.
· Providing prompt recovery and problem escalation using multiple system management and problem management tools.
· Working closely with development teams, support teams and vendors to coordinate special operations, and communicate/escalate problems to meet assigned deadlines.
· Providing analytical support for the development and enhancement of system interfaces, and developing/enhancing those tools under technical direction from seniors.
· Assisting in daily support of the systems/products assigned, through early detection and pursuit of changes in system responses or operations.
· Working closely with support groups to refine system monitoring, reporting, and to assist them in their analysis and problem recovery.
· Providing prompt, clear, and timely communications; log, coordinate, and communicate problems using standard problem management and escalation procedures.

- Create and manage ETL/ELT pipelines based on requirements.
- Build PowerBI dashboards and manage the datasets they need.
- Work with stakeholders to identify data structures needed for the future and perform any transformations, including aggregations.
- Build data cubes for real-time visualisation needs and CXO dashboards.
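As a rough illustration of the transform/aggregate step such a pipeline performs before a dashboard dataset is published (not taken from the posting; the region/amount schema and aggregate_sales are invented for the example, and only the Python standard library is used):

```python
import csv
import io
from collections import defaultdict

def aggregate_sales(raw_csv):
    """Transform step of a tiny ETL pipeline: total revenue per region."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        totals[row["region"]] += float(row["amount"])
    return dict(totals)

# In practice the extract step would read from a source system rather
# than an inline string, and the load step would write to a warehouse table.
raw = "region,amount\nnorth,10.5\nsouth,20.0\nnorth,4.5\n"
print(aggregate_sales(raw))  # {'north': 15.0, 'south': 20.0}
```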
Required Tech Skills
- Microsoft PowerBI & DAX
- Python, Pandas, PyArrow, Jupyter Notebooks, Apache Spark
- Azure Synapse, Azure DataBricks, Azure HDInsight, Azure Data Factory
About Kloud9:
Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers helps retailers launch a successful cloud initiative so they can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce cloud adoption time and effort, so clients benefit directly from lower migration costs.
Kloud9 was founded with the vision of bridging the gap between e-commerce and the cloud. Traditional e-commerce infrastructure in any industry is limiting and poses a huge challenge in terms of the money spent on physical data centres.
At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.
Our sole focus is to provide cloud expertise to the retail industry, giving our clients the empowerment that will take their business to the next level. Our team of proficient architects, engineers and developers has been designing, building and implementing solutions for retailers for an average of more than 20 years.
We are a cloud vendor that is both platform and technology independent. Our vendor independence not only provides us with a unique perspective on the cloud market but also ensures that we deliver the cloud solutions that best meet our clients' requirements.
What we are looking for:
● 3+ years’ experience developing Big Data & Analytic solutions
● Experience building data lake solutions leveraging Google Data Products (e.g. Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.), Hive, Spark
● Experience with relational SQL and NoSQL databases
● Experience with Spark (Scala/Python/Java) and Kafka
● Work experience with using Databricks (Data Engineering and Delta Lake components)
● Experience with source control tools such as GitHub and related dev process
● Experience with workflow scheduling tools such as Airflow
● In-depth knowledge of any scalable cloud vendor (GCP preferred)
● Has a passion for data solutions
● Strong understanding of data structures and algorithms
● Strong understanding of solution and technical design
● Has a strong problem solving and analytical mindset
● Experience working with Agile Teams.
● Able to influence and communicate effectively, both verbally and written, with team members and business stakeholders
● Able to quickly pick up new programming languages, technologies, and frameworks
● Bachelor’s Degree in computer science
Why Explore a Career at Kloud9:
With job opportunities in prime locations across the US, London, Poland and Bengaluru, we help build your career path in cutting-edge technologies of AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with its creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers!

- Consistently brainstorming and collaborating with the team on new ideas and strategies.
- Researching markets and industries to compare market trends.
- Organized Working Approach
- Experience in Public Relations
- Extremely Creative in Handling Social Media Marketing
- Online Advertising Management
- Creative Content Writing
- Influencer Collaboration for YouTube
- Marketing Campaign Creation to Ensure YouTube Growth



About us
DataWeave provides Retailers and Brands with “Competitive Intelligence as a Service” that enables them to take key decisions that impact their revenue. Powered by AI, we provide easily consumable and actionable competitive intelligence by aggregating and analyzing billions of publicly available data points on the Web to help businesses develop data-driven strategies and make smarter decisions.
Data Science@DataWeave
We, the Data Science team at DataWeave (called Semantics internally), build the core machine learning backend and structured domain knowledge needed to deliver insights through our data products. Our underpinnings are: innovation, business awareness, long-term thinking, and pushing the envelope. We are a fast-paced lab within the org, applying the latest research in Computer Vision, Natural Language Processing, and Deep Learning to hard problems in different domains.
How do we work?
It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems that there are. We are in the business of making sense of messy public data on the web. At serious scale!
What do we offer?
● Some of the most challenging research problems in NLP and Computer Vision. Huge text and image datasets that you can play with!
● Ability to see the impact of your work and the value you're adding to our customers almost immediately.
● Opportunity to work on different problems and explore a wide variety of tools to figure out what really excites you.
● A culture of openness. Fun work environment. A flat hierarchy. Organization-wide visibility. Flexible working hours.
● Learning opportunities with courses and tech conferences. Mentorship from seniors in the team.
● Last but not the least, competitive salary packages and fast-paced growth opportunities.
Who are we looking for?
The ideal candidate is a strong software developer or a researcher with experience building and shipping production grade data science applications at scale. Such a candidate has keen interest in liaising with the business and product teams to understand a business problem, and translate that into a data science problem.
You are also expected to develop capabilities that open up new business productization opportunities.
We are looking for someone with a Master's degree and 1+ years of experience working on problems in NLP or Computer Vision.
If you have 4+ years of relevant experience with a Master's degree (PhD preferred), you will be considered for a senior role.
Key problem areas
● Preprocessing and feature extraction of noisy and unstructured data, both text and images.
● Keyphrase extraction, sequence labeling, entity relationship mining from texts in different domains.
● Document clustering, attribute tagging, data normalization, classification, summarization, sentiment analysis.
● Image-based clustering and classification, segmentation, object detection, extracting text from images, generative models, recommender systems.
● Ensemble approaches for all the above problems using multiple text- and image-based techniques.
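As a toy illustration of the document-similarity primitive underlying clustering work like this (a hypothetical sketch in plain Python, not DataWeave's method; bow and cosine are invented names, and a real system would use TF-IDF weighting, proper tokenization, or learned embeddings):

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words vector for a short product title."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = ["red cotton shirt", "cotton shirt red slim", "stainless steel bottle"]
vecs = [bow(d) for d in docs]
# The two shirt listings should be far more similar to each other than to the bottle.
print(cosine(vecs[0], vecs[1]) > cosine(vecs[0], vecs[2]))  # True
```

Clustering messy retail listings at scale is essentially this comparison, repeated over billions of pairs with much richer features and approximate nearest-neighbour indexing.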
Relevant set of skills
● Have a strong grasp of concepts in computer science, probability and statistics, linear algebra, calculus, optimization, algorithms and complexity.
● Background in one or more of information retrieval, data mining, statistical techniques, natural language processing, and computer vision.
● Excellent coding skills in multiple programming languages with experience building production grade systems. Prior experience with Python is a bonus.
● Experience building and shipping machine learning models that solve real world engineering problems. Prior experience with deep learning is a bonus.
● Experience building robust clustering and classification models on unstructured data (text, images, etc.). Experience working with Retail domain data is a bonus.
● Ability to process noisy and unstructured data to enrich it and extract meaningful relationships.
● Experience working with a variety of tools and libraries for machine learning and visualization, including numpy, matplotlib, scikit-learn, Keras, PyTorch, and TensorFlow.
● Use the command line like a pro. Be proficient in Git and other essential software development tools.
● Working knowledge of large-scale computational models such as MapReduce and Spark is a bonus.
● Be a self-starter: someone who thrives in fast-paced environments with minimal ‘management’.
● It's a huge bonus if you have some personal projects (including open source contributions) that you work on in your spare time. Show off some of the projects you have hosted on GitHub.
Role and responsibilities
● Understand the business problems we are solving. Build data science capabilities that align with our product strategy.
● Conduct research. Do experiments. Quickly build throw away prototypes to solve problems pertaining to the Retail domain.
● Build robust clustering and classification models in an iterative manner that can be used in production.
● Constantly think scale, think automation. Measure everything. Optimize proactively.
● Take end to end ownership of the projects you are working on. Work with minimal supervision.
● Help scale our delivery, customer success, and data quality teams with constant algorithmic improvements and automation.
● Take initiatives to build new capabilities. Develop business awareness. Explore productization opportunities.
● Be a tech thought leader. Add passion and vibrance to the team. Push the envelope. Be a mentor to junior members of the team.
● Stay on top of latest research in deep learning, NLP, Computer Vision, and other relevant areas.

