
We are looking for self-driven individuals with strong problem-solving and customer service skills to fill Associate positions within the Operations group. The successful candidate will provide day-to-day customer support for all services on the Stockal platform. Must be well-versed in Excel.
Responsibilities:
Direct interaction with clients globally to resolve technical and non-technical queries.
Assist clients in completing compliance and onboarding on the Stockal platform.
Respond to client requests across inbound and outbound calls, chat, and email.
Requirements:
University graduate preferred.
Customer services experience or an equivalent combination of education and experience from which comparable knowledge can be acquired.
Previous experience working with financial services and products preferred.
Excellent interpersonal, written and verbal communication skills.
Teamwork.
Customer service and satisfaction are key priorities for the firm.
Candidates must be aware of the impact of their actions on internal and external clients.
Flexible working in shifts.

About Borderless Softech Pvt Ltd
Stockal helps retail investors in India, the Middle East, and South East Asia “globalize” their savings and wealth by investing in mature international markets such as the US. Using Stockal, people can open, own, and operate overseas investing accounts digitally and affordably. Investing on Stockal is backed by big-data-based automated research and analysis that helps investors make smarter decisions. Stockal helps you manage your portfolio, keep track of its health, and discover interesting stocks and investing themes every day, so you don't have to scour reams of data or depend on unreliable guesswork. Stockal is funded by investors with deep backgrounds and rich experience in the global financial services industry.
Why Work At Stockal?
One Amazing Blue Team
Work Life Jenga
Get Mentored by the best in the Industry
Health Insurance
Paid Time Off
Paternal Leaves
Work From Home
We are hiring for the position of Backend Developer / Senior Backend Developer, responsible for managing back-end services and the interchange of data between the server and users.
Your primary focus will be the development of all server-side logic, definition and maintenance of the central database, and ensuring high performance and responsiveness to requests from the front-end.
We are looking to recruit a candidate for a role that will require:
- Create and consume RESTful APIs
- Design, develop, and maintain internal and external applications
- Build efficient, testable, and reusable modules
- Write high quality, structured application/interface code and documentation
- Identify solutions through research and collaboration that resolve the root cause of problems as they arise
- Define functional and technical requirements for application software to develop skills and knowledge
- Troubleshoot, test and maintain the core product software and databases to ensure strong optimization and functionality
- Contribute to all phases of the development lifecycle
Requirements
∙ Proficient in Node.js
∙ 2+ years’ experience designing, querying, and updating databases in MySQL/NoSQL
∙ Passion for best design and coding practices and a desire to develop bold new ideas
∙ Good to have: knowledge of AWS, Redis, Elasticsearch
Education: Min. Graduate in related discipline
Work experience: 2 years relevant experience
Compensation: Based on industry standards
Interview mode: Face to face
Candidates must know the M365 collaboration environment: SharePoint Online, MS Teams, Exchange Online, Entra, and Purview. We need a developer with a strong understanding of data structures and problem solving, plus SQL, PowerShell, MS Teams app development, Python, Visual Basic, C#, JavaScript, Java, HTML, PHP, and C.
Experience: 4+ years.
Location: Vadodara & Pune
Skill set: Snowflake, Power BI, ETL, SQL, data pipelines
What you'll be doing:
- Develop, implement, and manage scalable Snowflake data warehouse solutions using advanced features such as materialized views, task automation, and clustering.
- Design and build real-time data pipelines from Kafka and other sources into Snowflake using Kafka Connect, Snowpipe, or custom solutions for streaming data ingestion.
- Create and optimize ETL/ELT workflows using tools like DBT, Airflow, or cloud-native solutions to ensure efficient data processing and transformation.
- Tune query performance, warehouse sizing, and pipeline efficiency using Snowflake's Query Profiling, Resource Monitors, and other diagnostic tools.
- Work closely with architects, data analysts, and data scientists to translate complex business requirements into scalable technical solutions.
- Enforce data governance and security standards, including data masking, encryption, and RBAC, to meet organizational compliance requirements.
- Continuously monitor data pipelines, address performance bottlenecks, and troubleshoot issues using monitoring frameworks such as Prometheus, Grafana, or Snowflake-native tools.
- Provide technical leadership, guidance, and code reviews for junior engineers, ensuring best practices in Snowflake and Kafka development are followed.
- Research emerging tools, frameworks, and methodologies in data engineering and integrate relevant technologies into the data stack.
What you need:
Basic Skills:
- 3+ years of hands-on experience with Snowflake data platform, including data modeling, performance tuning, and optimization.
- Strong experience with Apache Kafka for stream processing and real-time data integration.
- Proficiency in SQL and ETL/ELT processes.
- Solid understanding of cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with scripting languages like Python, Shell, or similar for automation and data integration tasks.
- Familiarity with tools like dbt, Airflow, or similar orchestration platforms.
- Knowledge of data governance, security, and compliance best practices.
- Strong analytical and problem-solving skills with the ability to troubleshoot complex data issues.
- Ability to work in a collaborative team environment and communicate effectively with cross-functional teams.
Responsibilities:
- Design, develop, and maintain Snowflake data warehouse solutions, leveraging advanced Snowflake features like clustering, partitioning, materialized views, and time travel to optimize performance, scalability, and data reliability.
- Architect and optimize ETL/ELT pipelines using tools such as Apache Airflow, DBT, or custom scripts, to ingest, transform, and load data into Snowflake from sources like Apache Kafka and other streaming/batch platforms.
- Work in collaboration with data architects, analysts, and data scientists to gather and translate complex business requirements into robust, scalable technical designs and implementations.
- Design and implement Apache Kafka-based real-time messaging systems to efficiently stream structured and semi-structured data into Snowflake, using Kafka Connect, KSQL, and Snowpipe for real-time ingestion.
- Monitor and resolve performance bottlenecks in queries, pipelines, and warehouse configurations using tools like Query Profile, Resource Monitors, and Task Performance Views.
- Implement automated data validation frameworks to ensure high-quality, reliable data throughout the ingestion and transformation lifecycle.
- Pipeline Monitoring and Optimization: Deploy and maintain pipeline monitoring solutions using Prometheus, Grafana, or cloud-native tools, ensuring efficient data flow, scalability, and cost-effective operations.
- Implement and enforce data governance policies, including role-based access control (RBAC), data masking, and auditing to meet compliance standards and safeguard sensitive information.
- Provide hands-on technical mentorship to junior data engineers, ensuring adherence to coding standards, design principles, and best practices in Snowflake, Kafka, and cloud data engineering.
- Stay current with advancements in Snowflake, Kafka, cloud services (AWS, Azure, GCP), and data engineering trends, and proactively apply new tools and methodologies to enhance the data platform.
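The automated data-validation responsibility above can be sketched in plain Python. This is a minimal illustration, not a Snowflake or Kafka API: the field names and rules are hypothetical, and a real pipeline would route rejected records to a quarantine table or dead-letter topic.

```python
# Hypothetical validation rules for incoming records; field names are
# illustrative assumptions, not part of any real schema.
REQUIRED_FIELDS = {"event_id", "event_time", "amount"}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation errors for one record (empty = valid)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        errors.append("amount is not numeric")
    return errors

def partition_batch(records):
    """Split a batch into (valid, rejected) before warehouse ingestion."""
    valid, rejected = [], []
    for rec in records:
        errs = validate_record(rec)
        (rejected if errs else valid).append((rec, errs))
    return [r for r, _ in valid], rejected
```

In practice the same check runs per micro-batch between ingestion (e.g. Snowpipe) and transformation, so bad records never reach downstream models.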
Role & responsibilities :
Tech sales executives who have handled calls on printers, Outlook, antivirus, and PPC.
Preferred candidate profile :
Technical Sales Inbound PPC Calls (Printer, Antivirus, Tech Support)
Salary: up to 50k + incentives
Location: Noida
Perks and benefits : Incentives + Meal
What we look for:
Resourceful individuals with a creative spirit to sketch our brand image through vividly portrayed content.
Responsibilities:
• Develop persuasive SEO-optimized content for blogs, campaigns, and newsletters
• Create research-oriented industry-specific content for websites
• Proofread content before delivery
• Identify customers’ needs and gaps in content and recommend new topics
• Generate fresh content ideas for social media
• Work with graphic designers to digitally deliver content concepts
Requirements:
• Graduate in any stream with exceptional writing skills
• Good communication and interpersonal skills
• Proficient with Microsoft Office Suite and Google Docs
• Ability to multitask, prioritize and manage time efficiently
Your Roles and Responsibilities:
Business Development is a critical aspect of our platform business.
1. Actively seeking out new sales opportunities through cold calling, networking, and social media.
2. Calling 65-70 leads every day
3. Setting up meetings with potential clients (parents)
4. Generating trial classes: pitch parents to take PlanetSpark trial classes
5. Negotiate/close deals and handle complaints or objections
6. Follow and achieve the department’s sales goals on a monthly, quarterly, and yearly basis (3L revenue per month)
7. “Go the extra mile” to drive sales
Job Description – Developer (ETL + Database)
Develop, document, and support ETL mappings, database structures, and BI reports.
Perform unit testing of their own development work.
Participate in UAT process and ensure quick resolution of any UAT issue.
Manage different environments and be responsible for proper deployment of code in all client environments.
Prepare release documents.
Prepare and Maintain project documents as advised by Team Leads.
Skill-sets:
3+ years of hands-on experience with ETL tools (Pentaho Spoon, Talend) and MS SQL Server, Oracle, and Sybase databases.
Ability to write complex SQL and database procedures.
Good knowledge and understanding of data warehouse concepts, ETL concepts, ETL loading strategies, data archiving, data reconciliation, ETL error handling, etc.
Problem-solving skills.
Good communication skills – written and verbal.
Self-motivated, team player, action and result oriented.
Ability to work successfully under tight project schedules.
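The data-reconciliation skill listed above can be illustrated with a short Python sketch: after a load, compare the source and target on row counts and a measure total, and list keys missing from the target. Column names (`id`, `amount`) are made-up examples, not from any real schema.

```python
def reconcile(source_rows, target_rows, key="id", measure="amount"):
    """Compare two loads on counts, measure totals, and missing keys.

    Rows are dicts; `key` and `measure` are hypothetical column names.
    """
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "source_total": sum(r[measure] for r in source_rows),
        "target_total": sum(r[measure] for r in target_rows),
    }
    source_keys = {r[key] for r in source_rows}
    target_keys = {r[key] for r in target_rows}
    report["missing_in_target"] = sorted(source_keys - target_keys)
    report["matched"] = (
        report["source_count"] == report["target_count"]
        and report["source_total"] == report["target_total"]
    )
    return report
```

A real implementation would run the equivalent aggregate queries against both databases rather than pulling rows into memory; the comparison logic is the same.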
Our client is a producer of superior-quality products and operates the largest single-location manufacturing facility in India, certified to ISO 9001, ISO 14001, and OHSAS 18001 along with other product-specific certifications.
What you will do:
- Tracking the progress of new initiatives and presenting business updates
- Supporting CEO in communications by preparing presentations and financial statements
- Researching on the current trends/ business models
- Helping prepare for meetings and accurately recording minutes from the same
- Preparing and analyzing required data and reports
- Coordinating with all department heads for the tasks given by CEO
- Being in charge of all the periodic communications from the CEO's office
- Ensuring timely and relevant escalations
- Managing bandwidth and assisting the CEO in prioritization
Desired Candidate Profile
What you need to have:
- 10 to 15 years of experience in a plant preferred
- Age : 35 years and above
- Females are preferred
- Proficiency in MS Office (Word, Excel including VLOOKUP/HLOOKUP, PowerPoint), SAP, and G Suite, and in setting up video interviews
- Good communication skills (written and verbal)
- Ability to work under pressure
- Attention to detail
- Ability to deal with people at various levels in the organisation
- Knowledge of shorthand is preferred
Role
We are looking for an experienced DevOps engineer who will help our team establish a DevOps practice. You will work closely with the technical lead to identify and establish DevOps practices in the company.
You will also help us build scalable, efficient cloud infrastructure. You’ll implement monitoring for automated system health checks. Lastly, you’ll build our CI pipeline, and train and guide the team in DevOps practices.
This would be a hybrid role, and the person would be expected to also do some application-level programming in their downtime.
Responsibilities
- Deployment, automation, management, and maintenance of production systems.
- Ensuring availability, performance, security, and scalability of production systems.
- Evaluation of new technology alternatives and vendor products.
- System troubleshooting and problem resolution across various application domains and platforms.
- Providing recommendations for architecture and process improvements.
- Definition and deployment of systems for metrics, logging, and monitoring on AWS platform.
- Manage the establishment and configuration of SaaS infrastructure in an agile way by storing infrastructure as code and employing automated configuration management tools with a goal to be able to re-provision environments at any point in time.
- Be accountable for proper backup and disaster recovery procedures.
- Drive operational cost reductions through service optimizations and demand based auto scaling.
- Have on call responsibilities.
- Perform root cause analysis for production errors
- Use open-source technologies and tools to accomplish specific use cases encountered within the project.
- Use coding languages or scripting methodologies to solve problems with custom workflows.
Requirements
- Systematic problem-solving approach, coupled with strong communication skills and a sense of ownership and drive.
- Prior experience as a software developer in a couple of high-level programming languages.
- Extensive experience in any JavaScript-based framework, since we will be deploying services to Node.js on AWS Lambda (serverless)
- Extensive experience with web servers such as Nginx/Apache
- Strong Linux system administration background.
- Ability to present and communicate the architecture in a visual form.
- Strong knowledge of AWS (e.g. IAM, EC2, VPC, ELB, ALB, Autoscaling, Lambda, NAT gateway, DynamoDB)
- Experience maintaining and deploying highly-available, fault-tolerant systems at scale (~ 1 Lakh users a day)
- A drive towards automating repetitive tasks (e.g. scripting via Bash, Python, Ruby, etc)
- Expertise with Git
- Experience implementing CI/CD (e.g. Jenkins, TravisCI)
- Strong experience with relational and NoSQL databases such as MySQL, Elasticsearch, Redis, and/or MongoDB.
- Stellar troubleshooting skills with the ability to spot issues before they become problems.
- Current with industry trends, IT ops and industry best practices, and able to identify the ones we should implement.
- Time and project management skills, with the capability to prioritize and multitask as needed.
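The "automating repetitive tasks" bullet above can be made concrete with a small sketch: given per-service health-check latencies, flag the slow ones. The service names and threshold are invented for illustration; a real setup would feed these numbers from actual probes into alerting (e.g. the monitoring stack named earlier).

```python
def evaluate_health(checks: dict, threshold_ms: float = 500.0) -> dict:
    """Given service -> response time in ms, flag services over the threshold.

    `checks` maps hypothetical service names to measured latencies.
    """
    unhealthy = {svc: ms for svc, ms in checks.items() if ms > threshold_ms}
    return {"healthy": not unhealthy, "slow_services": sorted(unhealthy)}
```

Scripts like this, scheduled via cron or a CI job, turn a manual daily check into an automated one with a clear pass/fail signal.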










