Business Partnership Jobs in Chennai
Primary Responsibilities
- Support enterprise account executives with solution selling into prospect account base
- Conduct in-depth product demonstrations of our solutions
- Present the products at events, conferences, seminars, and webinars
- Partner with sales executives to plan, prepare and execute on strategic deals involving complex sales cycles
- Model the financial business case and ROI associated with each sales opportunity
- Successfully match customer pain/requirements to proposed solutions
- Create and deliver powerful presentations and demos that clearly communicate the uniqueness of the value proposition
- Manage all technical aspects of RFP / RFI responses
- Effectively communicate client needs to Customer Success team during handover as well as to Product and Product Marketing teams for future product enhancements
- Collect and document competitive intelligence
- Work closely with Account Managers to plan expansion within strategic accounts
Requirements
- Proven work experience as a Sales Engineer at a B2B SaaS company, preferably in the cloud infrastructure domain
- Creativity to approach sales and build customer relationships in ground-breaking new ways
- Excellent written and oral communication skills
- Excellent organizational skills and a keen eye for detail
- Deeply data-driven with a growth mindset
- Negotiation and social problem-solving skills
- Ability to work in a complex sales environment and multitask
- Proven track record selling complex enterprise solutions
- Ability to forge strong, long-lasting relationships with senior executives
- Ability to creatively explain and present complex concepts in an easy to understand manner
- Solid technical background with understanding and/or hands-on experience in cloud technologies (Such as AWS and/or Azure and/or GCP, Automation, Orchestration, CI/CD tools, Scripting etc.)
- Excellent presentation and creativity skills
- Willingness to travel as needed
- BA/BS degree or equivalent
1. GCP - GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, BQ optimization, Airflow/Composer, Python (preferred)/Java
2. ETL on GCP - build pipelines (Python/Java) plus scripting, best practices, and common challenges (a minimal pipeline sketch follows this list)
3. Knowledge of batch and streaming data ingestion; build end-to-end data pipelines on GCP
4. Knowledge of databases (SQL, NoSQL), on-premise and on-cloud, SQL vs NoSQL, and types of NoSQL databases (at least 2)
5. Data warehouse concepts - beginner to intermediate level
6. Data modeling, GCP databases, DBSchema (or similar)
7. Hands-on data modelling for OLTP and OLAP systems
8. In-depth knowledge of conceptual, logical, and physical data modelling
9. Strong understanding of indexing, partitioning, and data sharding, with practical experience of having done the same
10. Strong understanding of variables impacting database performance for near-real-time reporting and application interaction
11. Working experience with at least one data modelling tool, preferably DBSchema or Erwin
12. Good understanding of GCP databases like AlloyDB, Cloud SQL, and BigQuery
13. Functional knowledge of the mutual fund industry is a plus
14. Should be willing to work from Chennai; office presence is mandatory
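To make item 2 concrete, here is a minimal sketch of a batch ingestion step on GCP: loading a CSV from GCS into BigQuery with the Python client library. The project, bucket, and table names are hypothetical placeholders, not part of this role description.

```python
# Minimal batch-ingestion sketch: load a CSV from GCS into BigQuery.
# Assumes application default credentials and an existing dataset;
# the bucket, path, and table names below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,        # header row
    autodetect=True,            # infer the schema from the file
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/raw/transactions.csv",   # placeholder GCS path
    "example_project.analytics.transactions",     # placeholder table
    job_config=job_config,
)
load_job.result()  # wait for completion

table = client.get_table("example_project.analytics.transactions")
print(f"Loaded {table.num_rows} rows.")
```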
Role & Responsibilities:
● Work with business users and other stakeholders to understand business processes.
● Design and implement dimensional and fact tables (a star-schema sketch follows this list)
● Identify and implement data transformation/cleansing requirements
● Develop a highly scalable, reliable, and high-performance data processing pipeline to extract, transform and load data from various systems to the Enterprise Data Warehouse
● Develop conceptual, logical, and physical data models with associated metadata including data lineage and technical data definitions
● Design, develop and maintain ETL workflows and mappings using the appropriate data load technique
● Provide research, high-level design, and estimates for data transformation and data integration from source applications to end-user BI solutions.
● Provide production support of ETL processes to ensure timely completion and availability of data in the data warehouse for reporting use.
● Analyze and resolve problems and provide technical assistance as necessary. Partner with the BI team to evaluate, design, develop BI reports and dashboards according to functional specifications while maintaining data integrity and data quality.
● Work collaboratively with key stakeholders to translate business information needs into well-defined data requirements to implement the BI solutions.
● Leverage transactional data from ERP, CRM, and HRIS applications to model, extract, and transform information for reporting and analytics.
● Define and document the use of BI through user experience/use cases, prototypes, test, and deploy BI solutions.
● Develop and support data governance processes, analyze data to identify and articulate trends, patterns, outliers, quality issues, and continuously validate reports, dashboards and suggest improvements.
● Train business end-users, IT analysts, and developers.
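For the dimensional and fact table responsibility above, the snippet below sketches one possible fact/dimension pair as BigQuery DDL issued from Python. The dataset, table, and column names are illustrative assumptions, not a prescribed schema.

```python
# Illustrative star-schema sketch: one dimension and one fact table in BigQuery.
# Dataset and column names are hypothetical; adapt to the actual business model.
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE TABLE IF NOT EXISTS analytics.dim_customer (
  customer_key   INT64 NOT NULL,
  customer_id    STRING,
  full_name      STRING,
  segment        STRING,
  effective_from DATE,
  effective_to   DATE
);

CREATE TABLE IF NOT EXISTS analytics.fact_transactions (
  transaction_id STRING NOT NULL,
  customer_key   INT64,            -- reference to dim_customer
  transaction_ts TIMESTAMP,
  amount         NUMERIC
)
PARTITION BY DATE(transaction_ts)   -- partitioning for near-real-time reporting
CLUSTER BY customer_key             -- clustering to speed common joins/filters
"""

# Run each DDL statement in turn.
for statement in ddl.split(";"):
    if statement.strip():
        client.query(statement).result()
```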
Job Overview
We are looking for a detail-oriented and skilled QA Engineer with expertise in Cypress to join our Quality Assurance team. In this role, you will be responsible for creating and maintaining automated test scripts to ensure the stability and performance of our web applications. You’ll work closely with developers, product managers, and other QA professionals to identify issues early and help deliver a high-quality user experience.
You should have a strong background in test automation, excellent analytical skills, and a passion for improving software quality through efficient testing practices.
Key Responsibilities
- Develop, maintain, and execute automated test cases using Cypress.
- Design robust test strategies and plans based on product requirements and user stories.
- Work with cross-functional teams to identify test requirements and ensure proper coverage.
- Perform regression, integration, smoke, and exploratory testing as needed.
- Report and track defects, and work with developers to resolve issues quickly.
- Collaborate in Agile/Scrum development cycles and contribute to sprint planning and reviews.
- Continuously improve testing tools, processes, and best practices.
- Optimize test scripts for performance, reliability, and maintainability.
Required Skills & Qualifications
- Hands-on experience with Cypress and JavaScript-based test automation.
- Strong understanding of QA methodologies, tools, and processes.
- Experience in testing web applications across multiple browsers and devices.
- Familiarity with REST APIs and tools like Postman or Swagger.
- Experience with version control systems like Git.
- Knowledge of CI/CD pipelines and integrating automated tests (e.g., GitHub Actions, Jenkins).
- Excellent analytical and problem-solving skills.
- Strong written and verbal communication.
Preferred Qualifications
- Experience with other automation tools (e.g., Selenium, Playwright) is a plus.
- Familiarity with performance testing or security testing.
- Background in Agile or Scrum methodologies.
- Basic understanding of DevOps practices.
About koolio.ai
Website: www.koolio.ai
koolio Inc. is a cutting-edge Silicon Valley startup dedicated to transforming how stories are told through audio. Our mission is to democratize audio content creation by empowering individuals and businesses to effortlessly produce high-quality, professional-grade content. Leveraging AI and intuitive web-based tools, koolio.ai enables creators to craft, edit, and distribute audio content—from storytelling to educational materials, brand marketing, and beyond—easily. We are passionate about helping people and organizations share their voices, fostering creativity, collaboration, and engaging storytelling for a wide range of use cases.
About the Full-Time Position
We are looking for a Junior QA Engineer (Fresher) to join our team on a full-time, hybrid basis. This is an exciting opportunity for a motivated fresher who is eager to learn and grow in the field of backend testing and quality assurance. You will work closely with senior engineers to ensure the reliability, performance, and scalability of koolio.ai’s backend services. This role is perfect for recent graduates who want to kickstart their career in a dynamic, innovative environment.
Key Responsibilities:
- Assist in the design and execution of test cases for backend services, APIs, and databases (a minimal pytest sketch follows this list)
- Perform manual and automated testing to validate the functionality and performance of backend systems
- Help identify, log, and track bugs, working closely with developers for issue resolution
- Contribute to developing automated test scripts to ensure continuous integration and deployment
- Document test cases, results, and issues in a clear and organized manner
- Continuously learn and apply testing methodologies and tools under the guidance of senior engineers
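To give a flavour of the automated backend checks described above, here is a minimal pytest sketch against a REST API. The base URL and endpoints are hypothetical placeholders, not koolio.ai's actual API.

```python
# Minimal API test sketch using pytest and requests.
# BASE_URL and the /health and /projects endpoints are hypothetical placeholders.
import requests
import pytest

BASE_URL = "https://api.example.com"   # assumed test environment URL


def test_health_endpoint_returns_ok():
    resp = requests.get(f"{BASE_URL}/health", timeout=10)
    assert resp.status_code == 200
    assert resp.json().get("status") == "ok"


@pytest.mark.parametrize("page_size", [1, 10, 50])
def test_projects_list_respects_page_size(page_size):
    resp = requests.get(f"{BASE_URL}/projects", params={"limit": page_size}, timeout=10)
    assert resp.status_code == 200
    assert len(resp.json().get("items", [])) <= page_size
```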
Requirements and Skills:
- Education: Degree in Computer Science or a related field
- Work Experience: No prior work experience required; internships or academic projects related to software testing or backend development are a plus
- Technical Skills:
- Basic understanding of backend systems and APIs
- Familiarity with SQL for basic database testing
- Exposure to any programming or scripting language (e.g., Python, JavaScript, Java)
- Interest in learning test automation tools and frameworks such as Selenium, JUnit, or Pytest
- Familiarity with basic version control systems (e.g., Git)
- Soft Skills:
- Eagerness to learn and apply new technologies in a fast-paced environment
- Strong analytical and problem-solving skills
- Excellent attention to detail and a proactive mindset
- Ability to communicate effectively and work in a collaborative, remote team
- Other Skills:
- Familiarity with API testing tools (e.g., Postman) or automation tools is a bonus but not mandatory
- Basic knowledge of testing methodologies and the software development life cycle is helpful
Compensation and Benefits:
- Total Yearly Compensation: ₹4.5-6 LPA based on skills and experience
- Health Insurance: Comprehensive health coverage provided by the company
Why Join Us?
- Be a part of a passionate and visionary team at the forefront of audio content creation
- Work on an exciting, evolving product that is reshaping the way audio content is created and consumed
- Thrive in a fast-moving, self-learning startup environment that values innovation, adaptability, and continuous improvement
- Enjoy the flexibility of a full-time hybrid position with opportunities to grow professionally and expand your skills
- Collaborate with talented professionals from around the world, contributing to a product that has a real-world impact
Key Responsibilities:
1. Threat Research: Work on researching emerging cyber threats. You will monitor threat actor activities, study their tactics, techniques, and procedures (TTPs), and help identify potential risks.
2. Alert Triage and Incident Analysis: Support the analysis of security alerts generated by our in-house platform. You will work alongside the team to identify critical issues and provide timely intelligence to help mitigate threats.
3. Data Collection and OSINT: Assist in gathering and analyzing data using Open Source Intelligence (OSINT) methodologies. You will help collect relevant information to support ongoing threat investigations (a small collection sketch follows this list).
4. Report Preparation: Contribute to the preparation of threat intelligence reports for internal and external stakeholders. You will learn how to convey complex technical information in a clear and actionable manner.
5. SOP Development: Collaborate with the team to develop and refine Standard Operating Procedures (SOPs) for systematic threat analysis. Your input will help ensure that our procedures are efficient and scalable.
6. Cross-functional Collaboration: Work closely with various teams, including product development and data acquisition, to support the integration of new intelligence sources and improve the effectiveness of our threat intelligence platform.
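As a small illustration of the OSINT collection step above, the sketch below pulls a plain-text indicator feed and extracts candidate IPv4 addresses with a regex. The feed URL is a hypothetical placeholder; real investigations would rely on vetted sources and proper IOC parsing.

```python
# Toy OSINT collection sketch: fetch a text feed and extract IPv4-looking indicators.
# FEED_URL is a hypothetical placeholder, not a real intelligence source.
import re
from collections import Counter

import requests

FEED_URL = "https://feeds.example.com/indicators.txt"   # placeholder
IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")


def collect_indicators(url: str) -> Counter:
    resp = requests.get(url, timeout=15)
    resp.raise_for_status()
    return Counter(IPV4_RE.findall(resp.text))


if __name__ == "__main__":
    for ip, count in collect_indicators(FEED_URL).most_common(10):
        print(f"{ip}\t{count}")
```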
Key Qualifications:
Educational Background: Completed a degree in Cybersecurity, Computer Science, Information Technology, or a related field.
Basic Knowledge of Cybersecurity: A foundational understanding of cybersecurity concepts, including web application security, threat analysis, and vulnerability assessment.
Familiarity with OSINT: Basic knowledge of Open Source Intelligence (OSINT) tools and methodologies for data collection.
Technical Skills: Familiarity with scripting languages such as Python, Ruby, or Go is a plus.
Experience with automation and data analysis tools will be advantageous.
Communication Skills: Strong written and verbal communication skills, with the ability to learn how to convey technical findings effectively.
Problem-Solving and Adaptability: A proactive attitude with strong problem-solving skills. You should be comfortable learning in a fast-paced and dynamic environment.
Additional Skills:
Interest in Cybersecurity Challenges: Participation in bug bounty programs, Capture The Flag (CTF) challenges, or cybersecurity competitions is a plus.
Willingness to Learn: A keen interest in developing skills in threat intelligence, threat actor profiling, and behavioral analysis.
Responsibilities:
· Supports the security of next-generation endpoint technology (Windows and Mac).
· Assists with coordination and implementation of efforts with various IT teams to ensure solutions are fully tested and deployed enterprise-wide.
· Designs, builds, tests, secures, and documents endpoint security standards.
· Participates on project teams as a resource representing Endpoint Engineering.
· Manages an engineering team and provides support for users with more advanced security software issues, with the highest service quality and customer satisfaction.
· Ensures OS, security, and application lifecycle updates are consistently deployed on a defined schedule.
· Leverages and consolidates SCCM, Intune, and other data sources to build executive reports that measure compliance and provide dashboards and metrics to leadership (a small reporting sketch follows this list).
· Analyzes security product issues and creates workarounds along with self-healing automation for silent remediations.
· Works closely with the Security Operations team to improve workflow and tool usage for enhanced monitoring and response capabilities.
· Evaluates risk, addresses security issues, and facilitates and executes remediation activities across the organization.
· Participates in security assessment and validation of new technologies and major security changes.
· Reviews security configurations and compliance on all endpoint technologies and assists with audits.
· On call for endpoint security incident response and operational functions, including triage, escalation, post-mortem, and remediation.
· Uses data and operational metrics to analyze project and task results. Creates reports and charts to track progress and measure trends in the environment. Leverages data and reporting tools to make data-driven decisions about projects and tasks.
· Engages with other IT support teams, including Network Engineering, Enterprise Infrastructure, Service Desk, IT Security, and Technical Services, to ensure a consistent approach for organizational support across the enterprise.
· Leads new product POCs and works closely with various stakeholders, including OEMs.
· Performs other duties and projects as assigned.
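As a rough illustration of the compliance-reporting responsibility above, the sketch below summarizes patch compliance from an endpoint export. The CSV file and its columns are an assumed export format, not the native SCCM or Intune schema.

```python
# Toy compliance-report sketch: summarize patch status from an endpoint export.
# "endpoints.csv" and its columns (hostname, os, last_patch_status) are an
# assumed export format, not the native SCCM/Intune schema.
import pandas as pd

df = pd.read_csv("endpoints.csv")

# Percentage of compliant devices per operating system.
summary = (
    df.groupby("os")["last_patch_status"]
      .apply(lambda s: (s == "compliant").mean() * 100)
      .round(1)
      .rename("compliant_pct")
      .reset_index()
)

print(summary.to_string(index=False))
summary.to_csv("compliance_summary.csv", index=False)  # hand off to dashboards/leadership
```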
Requirements:
· Strong analytical, troubleshooting, and problem-solving skills.
· Strong technical understanding of endpoint security in end-user computing.
· Effective organizational and time management skills.
· Effective team management skills.
· Demonstrates knowledge of, adherence to, monitoring of, and responsibility for compliance with the international regulations and laws as they pertain to this position (endpoint security).
· Strong understanding of past, current, and emerging malware and security exploits.
· Minimum of fifteen (15) years of comprehensive IT experience working with multiple operating systems, such as Windows desktop and server, Mac, Linux, etc.
· Minimum of five (5) years of experience with Cisco AMP or equivalent EDR/EPP/MDR/XDR tools.
· Minimum of five (5) years of experience with PMC or equivalent Endpoint Privilege Management tools.
· Knowledge of the Zero Trust security framework.
· Expert knowledge of EPP, EDR, XDR, MDR, Privilege Management, and Active Directory, including Group Policy, DNS, and organizational unit design best practices.
· Knowledge of SCCM, Azure Active Directory, and Intune.
· Deep understanding of the Windows registry, Windows and macOS permissions, macOS preferences, and the drivers and software required to build baseline images.
· Knowledge of mobile device management using Intune and JAMF.
· Report-writing expertise using database tools.
· Advanced skills in creating SOPs, SOWs, and presentation decks using Microsoft Office applications including, but not limited to, Visio, Word, Excel, PowerPoint, and Outlook.
· Exceptional verbal, written, and interpersonal communication skills.
· Ability to troubleshoot complex LAN/WAN issues related to connectivity, security, and physical location.
· Ability to deliver stable and high-quality working solutions under deadlines in a fast-paced and dynamic environment.
· Ability to make decisions that have significant impact on the enterprise.
· Ability to manage ambiguity and operate effectively when things are not certain or the way forward is not clear.
· Ability to provide consultation and expert advice to management.
· Ability to discuss emerging technologies such as cloud endpoint management.
· Ability to make informal and formal presentations, inside and outside the organization, speaking before the assigned team or other groups as needed.
· Ability to deal with complex, difficult problems involving multiple facets and variables in non-standardized situations.
· Knowledge of securing, supporting, and administering a virtual desktop environment running on VMware vSphere and Azure Virtual Desktop.
· Experience creating deployment test plans and test cases, managing deployment groups, and soliciting feedback from pilot users. General understanding of software development, agile, and CI/CD required.
· Bachelor's degree in Engineering or Cybersecurity preferred, and a minimum of five (5) years' experience with security engineering.
We are looking for someone who can make an impact by enabling innovation and growth; someone with passion for what they do and a vision for the future.
Responsibilities:
- Be the analytical expert at Kaleidofin, managing ambiguous problems by using data to execute sophisticated quantitative modeling and deliver actionable insights.
- Develop comprehensive skills including project management, business judgment, analytical problem solving and technical depth.
- Become an expert on data and trends, both internal and external to Kaleidofin.
- Communicate key state of the business metrics and develop dashboards to enable teams to understand business metrics independently.
- Collaborate with stakeholders across teams to drive data analysis for key business questions, communicate insights and drive the planning process with company executives.
- Automate scheduling and distribution of reports and support auditing and value realization (a small reporting sketch follows this list).
- Partner with enterprise architects to define and ensure proposed Business Intelligence solutions adhere to an enterprise reference architecture.
- Design robust data-centric solutions and architecture that incorporates technology and strong BI solutions to scale up and eliminate repetitive tasks
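As a toy illustration of the report-automation responsibility above, the sketch below runs a SQL summary and writes it out for distribution. The in-memory sqlite database and loans table are stand-ins for the actual warehouse; a production version would also hook into a scheduler such as cron or Airflow.

```python
# Toy report-automation sketch: run a SQL summary and write it to CSV for distribution.
# The in-memory sqlite database and "loans" table are placeholders for the real warehouse.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE loans (branch TEXT, disbursed_amount REAL, disbursed_on TEXT);
    INSERT INTO loans VALUES
        ('Chennai', 120000, '2024-06-01'),
        ('Chennai',  80000, '2024-06-02'),
        ('Madurai',  50000, '2024-06-01');
""")

report = pd.read_sql_query(
    """
    SELECT branch,
           COUNT(*)              AS loan_count,
           SUM(disbursed_amount) AS total_disbursed
    FROM loans
    GROUP BY branch
    ORDER BY total_disbursed DESC
    """,
    conn,
)

report.to_csv("daily_disbursement_report.csv", index=False)
print(report.to_string(index=False))
```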
Requirements:
- Experience leading development efforts through all phases of SDLC.
- 5+ years "hands-on" experience designing Analytics and Business Intelligence solutions.
- Experience with Quicksight, PowerBI, Tableau and Qlik is a plus.
- Hands on experience in SQL, data management, and scripting (preferably Python).
- Strong data visualisation design, data modeling, and inference skills.
- Hands-on, with experience managing small teams.
- Financial services experience preferred, but not mandatory.
- Strong knowledge of architectural principles, tools, frameworks, and best practices.
- Excellent communication and presentation skills to communicate and collaborate with all levels of the organisation.
- Team handling experience preferred for candidates with 5+ years of experience.
- Notice period of less than 30 days.
- Must have experience in field sales
- Bike mandatory
- Immediate joiners
- External distribution
- Daily monitoring and updating of leads
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech lead level. Reporting to the Director of Big Data, he/she will play a critical role in leading Punchh’s big data innovations. By leveraging prior industry experience in big data, he/she will help create cutting-edge data and analytics products for Punchh’s business partners.
This role requires close collaborations with data, engineering, and product organizations. His/her job functions include
- Work with large data sets and implement sophisticated data pipelines with both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success and data science to name a few.
- Act as a technical leader of Punchh’s big data platform that supports AI and BI products.
- Work with the infra and operations teams to monitor and optimize existing infrastructure
- Occasional business travels are required.
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge with cloud technologies, e.g. AWS and Azure.
- Excellent software engineering background. High familiarity with software development life cycle. Familiarity with GitHub/Airflow.
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR), and streaming (Kafka, Spark); a minimal streaming sketch follows this list.
- Strong problem solving skills with demonstrated rigor in building and maintaining a complex data pipeline.
- Exceptional communication skills and ability to articulate a complex concept with thoughtful, actionable recommendations.
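As a minimal sketch of the streaming technologies named in the requirements, the following uses Spark Structured Streaming to aggregate events from a Kafka topic. The broker address, topic name, and event schema are hypothetical placeholders, and the job needs the spark-sql-kafka connector on the classpath.

```python
# Minimal streaming-aggregation sketch: windowed counts of Kafka events per store.
# Broker, topic, and schema below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

schema = StructType([
    StructField("store_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
       .option("subscribe", "transactions")                  # hypothetical topic
       .load())

# Parse the JSON payload and aggregate into 5-minute windows per store.
events = raw.select(from_json(col("value").cast("string"), schema).alias("e")).select("e.*")

per_store = (events
             .withWatermark("event_time", "10 minutes")
             .groupBy(window(col("event_time"), "5 minutes"), col("store_id"))
             .count())

query = (per_store.writeStream
         .outputMode("update")
         .format("console")      # swap for a real sink in production
         .start())
query.awaitTermination()
```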
Must Have:
- Strong proficiency with .Net Core API development
- Experience with caching approaches for APIs
- Hands-on with SQL Server and how it differs from other popular databases
- Well versed with stored procedures, query plans, altering indexes, and troubleshooting performance bottlenecks
- Skilled at performance optimization of .NET APIs
- Experience and knowledge of the OpenAPI specification and Swagger documentation
- Experience in building Cloud Native Applications
- Experience in Microservices architecture
- Has worked on the deployment of applications using Docker
- Has knowledge of Kubernetes
- Has experience with cache systems like Redis
Nice to Have:
- Familiar with .NET design patterns
- Performance tuning of SQL Stored Procedures
Note: We are looking for immediate joiners; the offered candidate is expected to join within 15 days. Buyout reimbursement is available for applicants with a 30-60 day notice period who are ready to join within 15 days.
SaaS experience is good to have.







