11+ SOW Jobs in Chennai | SOW Job openings in Chennai
Apply to 11+ SOW Jobs in Chennai on CutShort.io. Explore the latest SOW Job opportunities across top companies like Google, Amazon & Adobe.
Role Description
This is a full-time, on-site role for a Business Operations Associate at DBaaS Software Private Limited in Chennai. The Business Operations Associate will be responsible for day-to-day operations, providing analytical support, assisting with administrative tasks, and delivering excellent customer service.
Qualifications
Strong business operations and analytical skills
Experience in providing administrative assistance
Excellent communication skills
Customer service-oriented mindset
Ability to work collaboratively in a team environment
Attention to detail and problem-solving abilities
Proficiency in Microsoft Office Suite
Experience in the software development or digital marketing industry is a plus
Bachelor's degree in Business Administration or related field
Develop a strategy to identify and evaluate tender opportunities aligned with the company's objectives and capabilities
Coordinate with various departments to gather the necessary information and documentation for the submission of tenders
Develop and submit compelling tender responses, ensuring that deadlines and project requirements are met
Communicate with clients and respond to queries during the tendering process
Analyze the tender results and provide feedback to the management for continuous improvement
Maintain close contact with project managers to ensure smooth project initiation, execution, and closure
Contribute to the definition of the scope, objectives, and deliverables of the project
Monitor the progress of the project and provide management and stakeholders with updates
Identify and mitigate project risks and issues, escalating as necessary
Review and negotiate contract terms and conditions to ensure alignment with company goals and policies
Monitor contract performance and ensure contractual compliance
Responsible for amending, extending, and renewing contracts
Maintain accurate and up-to-date records of all contracts and related documents.
Process Improvement:
Analyze existing business processes related to tenders and projects, identifying areas for improvement
Develop and implement process improvements to increase efficiency, productivity, and quality
Establish and maintain standard operating procedures (SOPs) for the management of tenders and projects
Role: Mainframe Developer
Skill Set: Mainframe, COBOL, JCL, DB2
Years of experience: 5 to 12
Location: Pune, Chennai
Mode of work: WFO
Job Description:
We are looking for a mid-senior-level Mainframe lead with 4 to 8 years of hands-on experience in COBOL programming, JCL, and DB2 coding and processing to deliver a critical project for one of our biggest clients in the banking domain. The individual should be passionate about technology and experienced in developing and managing cutting-edge technology applications.
Technical Skills:
- An excellent techie with strong hands-on experience in COBOL, JCL, and DB2.
- Preferably, good exposure to mainframe-to-distributed migration projects.
- A master of DB2.
- Should have good analytical and development skills in COBOL programming and JCL.
- Capable of analyzing requirements and developing software as per the project-defined software process.
- Develop and peer-review LLDs (initiate/participate in peer reviews).
- Should have good written and verbal communication skills.
Role Overview:
We are seeking a highly skilled and motivated Data Scientist to join our growing team. The ideal candidate will be responsible for developing and deploying machine learning models from scratch to production level, focusing on building robust data-driven products. You will work closely with software engineers, product managers, and other stakeholders to ensure our AI-driven solutions meet the needs of our users and align with the company's strategic goals.
Key Responsibilities:
- Develop, implement, and optimize machine learning models and algorithms to support product development.
- Work on the end-to-end lifecycle of data science projects, including data collection, preprocessing, model training, evaluation, and deployment (a minimal sketch follows this list).
- Collaborate with cross-functional teams to define data requirements and product taxonomy.
- Design and build scalable data pipelines and systems to support real-time data processing and analysis.
- Ensure the accuracy and quality of data used for modeling and analytics.
- Monitor and evaluate the performance of deployed models, making necessary adjustments to maintain optimal results.
- Implement best practices for data governance, privacy, and security.
- Document processes, methodologies, and technical solutions to maintain transparency and reproducibility.
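
As referenced above, the end-to-end lifecycle can be pictured with a minimal sketch covering preprocessing, training, evaluation, and persisting a deployable artifact. The library choices, dataset path, "label" column, and model are illustrative assumptions, not specifics of the role.

```python
# Minimal sketch of the train -> evaluate -> persist portion of a data science
# lifecycle, using scikit-learn and joblib. The CSV path, "label" column, and
# model choice are hypothetical placeholders.
import joblib
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Load and split the data (hypothetical CSV with a "label" column).
df = pd.read_csv("training_data.csv")
X, y = df.drop(columns=["label"]), df["label"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Preprocessing and model training bundled into a single pipeline.
pipeline = Pipeline([
    ("scaler", StandardScaler()),
    ("model", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X_train, y_train)

# Evaluate on held-out data before promoting the model.
print("holdout accuracy:", accuracy_score(y_test, pipeline.predict(X_test)))

# Persist the fitted pipeline as a deployable artifact.
joblib.dump(pipeline, "model.joblib")
```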
Qualifications:
- Bachelor's or Master's degree in Data Science, Computer Science, Engineering, or a related field.
- 5+ years of experience in data science, machine learning, or a related field, with a track record of developing and deploying products from scratch to production.
- Strong programming skills in Python and experience with data analysis and machine learning libraries (e.g., Pandas, NumPy, TensorFlow, PyTorch).
- Experience with cloud platforms (e.g., AWS, GCP, Azure) and containerization technologies (e.g., Docker).
- Proficiency in building and optimizing data pipelines, ETL processes, and data storage solutions (a compact ETL sketch follows this list).
- Hands-on experience with data visualization tools and techniques.
- Strong understanding of statistics, data analysis, and machine learning concepts.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a fast-paced, dynamic environment.
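
The data-pipeline/ETL qualification above can be illustrated with a compact sketch: extract from a CSV, apply a simple transformation, and load into a local SQLite table with pandas. The file names, column names, and target table are hypothetical.

```python
# Minimal ETL sketch with pandas and SQLite. The source file, column names,
# and target table are placeholders for illustration only.
import sqlite3
import pandas as pd

# Extract: read raw events from a (hypothetical) CSV export.
raw = pd.read_csv("raw_events.csv", parse_dates=["event_time"])

# Transform: drop duplicates, normalise a text column, and add a derived field.
clean = (
    raw.drop_duplicates()
       .assign(
           event_type=lambda df: df["event_type"].str.lower().str.strip(),
           event_date=lambda df: df["event_time"].dt.date,
       )
)

# Load: write the cleaned table into a local SQLite database.
with sqlite3.connect("analytics.db") as conn:
    clean.to_sql("events_clean", conn, if_exists="replace", index=False)
```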
Preferred Qualifications:
- Knowledge of microservices architecture and RESTful APIs.
- Familiarity with Agile development methodologies.
- Experience in building taxonomy for data products.
- Strong communication skills and the ability to explain complex technical concepts to non-technical stakeholders.
• Charting learning journeys with knowledge graphs.
• Predicting memory decay based upon an advanced cognitive model (see the sketch after this list).
• Ensuring content quality via study-behavior anomaly detection.
• Recommending tags using NLP for complex knowledge.
• Auto-associating concept maps from loosely structured data.
• Predicting knowledge mastery.
• Personalizing search queries.
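
The memory-decay item above lends itself to a small worked example. The listing does not name the cognitive model, so the sketch below assumes a simple exponential forgetting curve with a per-item stability parameter; the function names and review threshold are hypothetical.

```python
# Minimal sketch of a memory-decay estimate using an exponential forgetting
# curve (an assumed baseline; the actual cognitive model is not specified in
# the listing).
import math

def recall_probability(hours_since_review: float, stability_hours: float) -> float:
    """Estimated probability of recall t hours after the last review.

    `stability_hours` is a per-learner, per-item memory strength parameter;
    how it is fitted (review history, item difficulty, etc.) is left open here.
    """
    return math.exp(-hours_since_review / stability_hours)

def needs_review(hours_since_review: float, stability_hours: float,
                 threshold: float = 0.7) -> bool:
    """Flag an item for review once predicted recall drops below a threshold."""
    return recall_probability(hours_since_review, stability_hours) < threshold

if __name__ == "__main__":
    # Example: an item last reviewed 48 hours ago with 36 hours of stability.
    print(recall_probability(48, 36))   # ~0.26
    print(needs_review(48, 36))         # True
```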
Requirements:
• 6+ years of experience in AI/ML with end-to-end implementation.
• Excellent communication and interpersonal skills.
• Expertise in SageMaker, TensorFlow, MXNet, or equivalent.
• Expertise with databases (e.g., NoSQL, graph).
• Expertise with backend engineering (e.g., AWS Lambda, Node.js).
• Passionate about solving problems in education.
Experience: 3 to 5 years
React Native framework app development for iOS and Android.
Build app and UI components from prototypes and wireframes.
Working with Redux architecture.
Expo tool for development and app releases (advantage).
APK and IPA generation using the Expo tool.
CaratLane is a technology-driven organization and India's first omnichannel jewelry brand. It
was founded in 2008 by Mithun Sacheti with a simple but courageous objective: to make
beautiful jewelry accessible, affordable and forever wearable. With a strategic investment from
Titan Company Limited, CaratLane is now partnered with India’s largest retail jeweler Tanishq.
Under the leadership of our co-founders Gurukeerthi Gurunathan and Avnish Anand, CaratLane
aims to work towards a common mission – to offer customers beautiful jewelry and a
distinctive shopping experience that fits today’s values and lifestyles.
Desired candidate profile :
● Development experience with Android OS and knowledge of web services/API interactions, audio/video streaming, SQLite, and JSON/XML parsing.
● Understanding of object-oriented programming and a good command of an object-oriented language.
● Strong grasp of algorithms and data structures.
● Hands-on Linux experience
● Experience launching mobile applications in the Google Play Store or the Apple App Store.
● Experience launching web applications.
● Experience with an MVC framework like Ruby on Rails, Django, Laravel
● Experience with server-side and client-side JavaScript - Node.js, Angular.js, jQuery.
● Experience with version control tools like git, Subversion and Mercurial
● Experience with Amazon Web Services
● Experience with Distributed systems and machine learning
● Knowledge of design patterns
● Ability to write and maintain high-quality code.
● Must be a quick learner and be adaptable to new technologies.
● Comfortable in an agile start-up culture based on respect, low hierarchy, high
transparency and fast sprints.
Nice to Have :
● Domain knowledge in eCommerce
● Previous experience in a product company is a plus.
What we value as a team:
● Proactive in communication
● Collaborate with other members of the agile ecosystem
● Out-of-the-box thinking to resolve issues and bring new ideas that improve the quality of the applications
Job Description:
Responsibilities:
- Backlog ownership: Work with the development pod as Product Owner
- Execution: Providing vision and direction to the Agile development team and other stakeholders, and creating/managing requirements
- Prioritization: Plan and prioritize the product feature backlog and development for the product
- Flow: Ensure that the team always has an adequate amount of tasks to work on
- Collaboration: Work with Development Managers, Solution Architect, Delivery, and the development team(s) on estimation in order to deliver the project or program
- Technical: Work with Solution Architects to define the technical solutions and strategy for the product features
- Quality: Work with QAs to understand, document, triage and fix live issues or bugs
- Planning: Provide backlog management, iteration planning, and elaboration of the user stories
- Expertise: Gain and maintain a deep understanding of cloud governance-based business processes and workload deployments providing focus and prioritisation to product development processes
- Product Management: Contribute to product strategy, proposition development, prioritisation and release planning
Minimum Requirements:
- 5+ years of Product ownership or related experience, with a significant part of it working on Cloud and SaaS technologies
- Outstanding collaboration, communication, listening, presentation and leadership skills
- Comfortable being hands-on in a fast-paced, ambiguous and rapidly changing environment
- Strong champion of Agile product development and tooling
- Exceptional road mapping skills and product domain knowledge
- Deep analytical, problem solving and communication skills
- Highly organised with great project management skills
- Numerate and data driven decision-maker
- Self-starter, able to work with minimal guidance and adapt to changing requirements
Desired:
- Experience of different project management tools and methodologies
- Interest in new technologies and development processes
- Interest in SaaS & Cloud computing
- AWS, Azure or GCP certification preferred.
We are looking for an outstanding ML Architect (Deployments) with expertise in deploying Machine Learning solutions/models into production and scaling them to serve millions of customers. A candidate with an adaptable and productive working style which fits in a fast-moving environment.
Skills:
- 5+ years deploying Machine Learning pipelines in large enterprise production systems.
- Experience developing end to end ML solutions from business hypothesis to deployment / understanding the entirety of the ML development life cycle.
- Expert in modern software development practices; solid experience using source control management (CI/CD).
- Proficient in designing relevant architecture/microservices to fulfil application integration, model monitoring, training/re-training, model management, model deployment, model experimentation/development, and alert mechanisms.
- Experience with public cloud platforms (Azure, AWS, GCP).
- Serverless services like AWS Lambda, Azure Functions, and/or Cloud Functions.
- Orchestration services like Data Factory, Data Pipeline, and/or Dataflow.
- Data science workbench/managed services like Azure Machine Learning, SageMaker, and/or AI Platform.
- Data warehouse services like Snowflake, Redshift, BigQuery, and/or Azure SQL DW.
- Distributed computing services like PySpark, EMR, and/or Databricks.
- Data storage services like Cloud Storage, S3, Blob Storage, and/or S3 Glacier.
- Data visualization tools like Power BI, Tableau, QuickSight, and/or Qlik.
- Proven experience serving up predictive algorithms and analytics through batch and real-time APIs (a minimal serving sketch follows this list).
- Solid working experience with software engineers, data scientists, product owners, business analysts, project managers, and business stakeholders to design the holistic solution.
- Strong technical acumen around automated testing.
- Extensive background in statistical analysis and modeling (distributions, hypothesis testing, probability theory, etc.)
- Strong hands-on experience with statistical packages and ML libraries (e.g., Python scikit-learn, Spark MLlib, etc.)
- Experience in effective data exploration and visualization (e.g., Excel, Power BI, Tableau, Qlik, etc.)
- Experience in developing and debugging in one or more of the languages Java, Python.
- Ability to work in cross functional teams.
- Apply Machine Learning techniques in production including, but not limited to, neural nets, regression, decision trees, random forests, ensembles, SVM, Bayesian models, K-Means, etc.
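
To make the API-serving requirement above concrete, here is a minimal sketch of a real-time prediction endpoint in front of a persisted model. Flask is used purely for illustration; the framework, endpoint name, payload schema, and model path are assumptions, not details from the role.

```python
# Minimal sketch of a real-time prediction API in front of a persisted model,
# here with Flask and joblib. Endpoint name, payload schema, and model path
# are illustrative placeholders.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # fitted pipeline produced offline

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [[...], [...]]}.
    payload = request.get_json(force=True)
    predictions = model.predict(payload["features"]).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```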
Roles and Responsibilities:
Deploying ML models into production, and scaling them to serve millions of customers.
Technical solutioning skills with deep understanding of technical API integrations, AI / Data Science, BigData and public cloud architectures / deployments in a SaaS environment.
Strong stakeholder relationship management skills - able to influence and manage the expectations of senior executives.
Strong networking skills with the ability to build and maintain strong relationships with business, operations, and technology teams, both internally and externally.
Provide software design and programming support to projects.
Qualifications & Experience:
Engineering and postgraduate candidates, preferably in Computer Science, from premier institutions, with 5-7 years of proven work experience as a Machine Learning Architect (Deployments) or in a similar role.