
1. As a member of the team, you will work alongside our clients to help design and deliver test solutions that meet business requirements and improve the effectiveness of the clients' software quality assurance process.
2. You should have extensive experience delivering testing services on commercial projects and extensive knowledge of agile practices.
3. You are a techie who is passionate about software quality. You take pride in being hands-on and constantly strive to meet or exceed clients' software quality expectations.
4. You have 6+ years of experience in quality assurance, including some experience in automation.
5. You have exceptional analytical skills. While your projects may be extremely diverse in domain, scope, and requirements, you must be able to comprehensively analyze those business problems and propose solutions accordingly.
6. You have led QA efforts on your projects: you have defined testing strategies, estimated test effort, worked on automation frameworks, coordinated with client teams, and stayed involved through deployment.
7. You are well versed in bug-tracking tools such as Jira, Azure DevOps, and Bugzilla, and API testing tools such as Postman, Swagger, and SoapUI.
8. Experience using automation tools such as Selenium WebDriver, Cypress, or Appium to automate enterprise software applications is an added advantage, but not a must-have.
9. You have excellent communication and task management skills, along with client-facing capability and experience.
10. You have worked on agile projects and can advise clients on implementing agile testing processes.
11. You have experience using BDD frameworks for responsive web application or mobile application testing.
12. API integration testing experience is an added advantage.
13. Regardless of your experience, you are hands-on with testing and would like to remain so.
14. You are good at sharing knowledge and mentoring junior team members.
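The BDD experience mentioned in point 11 can be illustrated with a minimal, framework-agnostic sketch. The page object and login scenario below are hypothetical and exist only to show the Given/When/Then shape; real projects would typically use behave or pytest-bdd driving Selenium or Appium.

```python
# Minimal BDD-style (Given/When/Then) scenario without any framework.
# The LoginPage object and credentials are hypothetical, for illustration.

class LoginPage:
    """Stand-in for a page object driving a responsive web app."""
    def __init__(self):
        self.logged_in = False

    def submit(self, user, password):
        # A real implementation would drive Selenium/Cypress here.
        self.logged_in = (user == "qa@example.com" and password == "secret")

def test_valid_user_can_log_in():
    # Given a user on the login page
    page = LoginPage()
    # When they submit valid credentials
    page.submit("qa@example.com", "secret")
    # Then they are logged in
    assert page.logged_in

test_valid_user_can_log_in()
```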

About Testrig Technologies
Experience: 4+ years
Location: Vadodara & Pune
Skill Set: Snowflake, Power BI, ETL, SQL, Data Pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency using Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
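The streaming-ingestion responsibility above hinges on keeping the warehouse idempotent when Kafka batches are replayed. The sketch below simulates that with a pure-Python micro-batch loader that deduplicates on (partition, offset); the record shape and in-memory "warehouse" are hypothetical stand-ins, and a real pipeline would use Kafka Connect or Snowpipe against an actual Snowflake table.

```python
# Illustrative micro-batch loader: deduplicate Kafka-style records by
# (partition, offset) so replayed batches do not create duplicate rows.
# The in-memory "warehouse" list stands in for a Snowflake table.

def load_batch(batch, warehouse, seen_offsets):
    """Append only records whose (partition, offset) is unseen."""
    loaded = 0
    for rec in batch:
        key = (rec["partition"], rec["offset"])
        if key in seen_offsets:
            continue  # replayed record: skip to keep the sink idempotent
        seen_offsets.add(key)
        warehouse.append(rec["value"])
        loaded += 1
    return loaded
```

Replaying the same batch then loads zero new rows, which is exactly the property a restarted consumer needs.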
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
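The automated data validation responsibility in the list above can be sketched as a set of row-level checks whose failures are quarantined before loading downstream. The rules and the `order_id` column here are hypothetical; production pipelines would typically express such checks as dbt tests or via a validation framework.

```python
# Minimal row-level validation: each check returns the rows that fail it,
# so a pipeline can quarantine bad records before loading downstream.

def check_not_null(rows, column):
    return [r for r in rows if r.get(column) is None]

def check_unique(rows, column):
    seen, dupes = set(), []
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.append(r)
        seen.add(v)
    return dupes

def validate(rows):
    """Run all checks; return a dict of check name -> failing rows."""
    return {
        "order_id_not_null": check_not_null(rows, "order_id"),
        "order_id_unique": check_unique(rows, "order_id"),
    }
```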
Company Description
Appectual IT Solutions is a software development company based in Mumbai, specializing in creating usable, functional, and intuitive websites and apps. It is known for delivering impactful solutions powered by the latest technologies across various industries. Appectual has a track record of working with reputable brands like L'Oréal, Reebok, Wockhardt, Apar Lubricants, and Birla, and is committed to finding the perfect solutions for its clients' needs.
Role Description
This is a full-time hybrid role as a Business Development Executive at Appectual IT Solutions in Mumbai. The Business Development Executive will be responsible for new business development, lead generation, business communication, and account management on a day-to-day basis.
Must-Have Skills:
Proposal Creation
Good conversion rate history
Relevant Experience: 6 months to 2 years
Expected Joining: 30 Days
Qualifications
New Business Development and Lead Generation skills
Strong business acumen and communication skills
Account Management expertise
Prior experience in sales or business development
Knowledge of the IT industry and software solutions
Ability to work well in a team and independently
Bachelor's degree in Business Administration, Marketing, or related field
• .NET Web Developer
• ASP.NET MVC
• Web API
• SQL Server
• C#
• JavaScript
• jQuery
Looking for immediate joiners for the Pune location only, with good experience in B2B, SaaS, and field sales.
Team handling experience is compulsory.
CupShup is looking for an Operations Executive for the respective city. You will be responsible for network growth, BTL campaign functions and operations, and audits, i.e. internal audits, client audits, cost audits, etc.
Responsibilities:
- Execution of BTL / offline marketing campaigns across locations
- Strengthening network of promoters
- Vendor/Partners Empanelment
- Logistics and Collaterals
- Success of the campaigns
- Process improvement and streamlining
- GST Compliance and reconciliation
- For Client - Media, Financial, Rate Benchmarking, Competition
- Functional Audits as and when required by Management
- Regional reporting
- Network tracker
- Collaterals Tracker
- Partners Tracker
Requirements:
- Graduate
- 0-2 years of experience; BTL marketing experience is a must
- Self-motivated, passionate, with a never-give-up attitude
- Team player
- Fair communication and expression skills
- Willing to learn not only operations but also marketing and other functions
- Two-wheeler and driving license is an added advantage
Requirements:
- 2+ years of relevant work experience as a Developer or SDET
- Fluency in Java and JavaScript test automation
- Comprehensive knowledge of unit, integration, and functional testing
- Experience working with tools such as Cypress
- Hands-on experience writing API automation scripts and unit test scripts
- Experience independently building test automation frameworks for web and mobile
- Good knowledge of databases and a querying language
- Experience working with an "Agile + DevOps" process management methodology
- Exposure to continuous integration tools like Jenkins/CircleCI
- Understanding of REST services and proficiency with REST tools and libraries (Rest Assured and Postman)
- Team player
- Good verbal and written communication skills
Good to Have - Familiarity with startup culture and work expectations
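To illustrate the kind of REST assertion logic the list above asks for, here is a small sketch of a response-checking helper. It is written in Python purely for illustration (a real suite would use Rest Assured in Java or Postman tests), and the `/users/42` payload is hypothetical.

```python
# A tiny assertion helper in the spirit of Rest Assured's fluent checks,
# operating on an already-parsed JSON response body.

def assert_response(body, status, expected_status, required_fields):
    """Fail loudly if the status or any required field is wrong/missing."""
    assert status == expected_status, f"expected {expected_status}, got {status}"
    missing = [f for f in required_fields if f not in body]
    assert not missing, f"missing fields: {missing}"
    return body

# Hypothetical parsed response from GET /users/42
body = {"id": 42, "name": "Ada", "email": "ada@example.com"}
user = assert_response(body, 200, 200, ["id", "name", "email"])
print(user["name"])  # prints "Ada"
```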
Blume Global (HQ California, www.blumeglobal.com) is a disrupter in the supply chain software market and has built the next-generation, cloud-first Digital Supply Chain Platform for Fortune 500 companies. Blume Global uses its 25+ years of data insights and global network to help enterprises be more agile, improve service delivery, and reduce cost by removing significant waste from their operations.
Role Summary:
As an experienced Analyst, you will primarily make our customers think you are magical by resolving complex problems through your technical and product expertise. As you learn more about our product suite, you will extend your depth of knowledge on the products you support and expand into new technology stacks and supply chain domain knowledge. To hone your technical prowess, you will dig deep into databases, data files, logs, and traces to find the source of any problem. Finally, you will be someone our customers trust: they will depend on you to provide timely and accurate information about their application issues.
Responsibilities:
• Prior experience working in an Application/Production support environment.
• MySQL knowledge and SQL querying abilities are needed. Skills in Python scripting would be advantageous.
• Troubleshooting and developing new solutions that solve the root cause of customer problems in tickets escalated from our L1 support team. Working independently within the team.
• Problem management (identifying recurring incidents, notifying L3 for permanent fixes).
• Participating, along with the Customer Success Manager, in weekly and monthly reviews with customers.
• Writing step-by-step processes, technical solutions, and ticket updates to customers using clear and concise English.
• Studying ticket patterns and suggesting improvements. Identifying areas that can be automated.
• Experience with application support ticketing tools such as ServiceNow and Jira.
• Thorough understanding of SLA management and operational reporting.
• Providing value to the customer in line with quality, process improvement, and other customer-centric initiatives.
• ITIL V3 Foundation certified and thorough in service management processes: Event Management, Incident Management, Problem Management, and Change Management.
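The problem-management bullet above (spotting recurring incidents so L3 can ship a permanent fix) can be sketched as a simple ticket-pattern count. The ticket categories and the recurrence threshold below are hypothetical; in practice this data would come from ServiceNow or Jira exports.

```python
from collections import Counter

# Flag ticket categories that recur often enough to warrant a permanent
# L3 fix rather than repeated L2 workarounds.

def recurring_categories(tickets, threshold=3):
    """Return sorted categories with at least `threshold` tickets."""
    counts = Counter(t["category"] for t in tickets)
    return sorted(c for c, n in counts.items() if n >= threshold)
```

Running this over a week's tickets gives L2 a short, defensible list to raise in the weekly customer review.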
Skills and Experience :
- 2-7 years of experience as a Java/J2EE developer.
- 1-3 years of experience with Angular / React is desirable.
- 2-5 years of experience in using Spring and Spring Boot frameworks.
- Thorough knowledge of server-side development.
- Proven experience as a Full Stack Developer or similar role.
- Good understanding of web services (WSDL, SOAP, RESTful).
- Hands-on experience in using Application Servers like WebSphere.
- Expertise in relational databases (Oracle, SQL Server).
- AI/ML domain knowledge is desirable.
- Familiarity with common stacks.
- Knowledge of multiple frontend languages and libraries, like HTML/ CSS, JavaScript, XML, jQuery.
- Experience in implementation of Microservices
- Experience with AWS (S3, SQS, SNS, ECS, EC2, ALB, API Gateway, Lambda, etc.) is highly desirable
- Good understanding of Docker & Kubernetes is highly desired.
- Familiarity with databases (MySQL, MongoDB, PostgreSQL), web servers (Apache), and UI/UX design.
- Excellent communication and teamwork skills.
About LevaData
LevaData, the Cognitive Sourcing Platform, offers global enterprises the ability to improve gross margins by reducing supply chain costs, with a focus on delivering measurable and accountable supply chain solutions and strategies that transform companies and markets. Customers include leaders in the top global supply chain organizations, as well as medium-sized OEMs seeking to achieve best-in-class direct materials sourcing practices. LevaData is privately held and headquartered in Sunnyvale, Calif. For more information, visit www.levadata.com.
Levadata Clientele: IBM, Fitbit, Lenovo, Zebra, Poly, Commscope
LevaData Fundings:
June, 2018: Series B, $12 million.
Aug, 2017: Series A, $5 million.
- Writing financial documents, news items, articles, and research reports.
- Keeping updated on various financial regulatory and global activities.
- Keeping watch on various global stock markets, especially the US market.
- Writing various financial documents for clients, mainly fintech companies and agencies.
- Developing content for print, online, and presentation materials.
What you need to have:
- Minimum of 1 year of experience in financial report writing
- Exposure to the US financial/securities market
- Candidates from a financial background, or with experience in financial content writing (articles, blogs) at financial news or financial writing companies










