
Data Scientist
Job Id: QX003
About Us:
QX Impact was launched with a mission to make AI accessible and affordable, and to deliver AI products and solutions at scale for enterprises by bringing together the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses struggle to understand their customers and risk losing them; without insights, they cannot deliver differentiated products and services; and without insights, they cannot achieve the level of operational excellence that is crucial to staying competitive, meeting rising customer expectations, expanding into new markets, and digitalizing.
Position Overview:
We are seeking a collaborative and analytical Data Scientist who can bridge the gap between business needs and data science capabilities. In this role, you will lead and support projects that apply machine learning, AI, and statistical modeling to generate actionable insights and drive business value.
Key Responsibilities:
- Collaborate with stakeholders to define and translate business challenges into data science solutions.
- Conduct in-depth data analysis on structured and unstructured datasets.
- Build, validate, and deploy machine learning models to solve real-world problems.
- Develop clear visualizations and presentations to communicate insights.
- Drive end-to-end project delivery, from exploration to production.
- Contribute to team knowledge sharing and mentorship activities.
Must-Have Skills:
- 3+ years of progressive experience in data science, applied analytics, or a related quantitative role, demonstrating a proven track record of delivering impactful data-driven solutions.
- Exceptional programming proficiency in Python, including extensive experience with core libraries such as Pandas, NumPy, Scikit-learn, NLTK, and XGBoost.
- Expert-level SQL skills for complex data extraction, transformation, and analysis from various relational databases.
- Deep understanding and practical application of statistical modeling and machine learning techniques, including but not limited to regression, classification, clustering, time series analysis, and dimensionality reduction.
- Proven expertise in end-to-end machine learning model development lifecycle, including robust feature engineering, rigorous model validation and evaluation (e.g., A/B testing), and model deployment strategies.
- Demonstrated ability to translate complex business problems into actionable analytical frameworks and data science solutions, driving measurable business outcomes.
- Proficiency in advanced data analysis techniques, including Exploratory Data Analysis (EDA), customer segmentation (e.g., RFM analysis), and cohort analysis, to uncover actionable insights.
- Experience in designing and implementing data models, including logical and physical data modeling, and developing source-to-target mappings for robust data pipelines.
- Exceptional communication skills, with the ability to clearly articulate complex technical findings, methodologies, and recommendations to diverse business stakeholders (both technical and non-technical audiences).
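The RFM (Recency, Frequency, Monetary) analysis mentioned in the skills above can be sketched in a few lines of Python. A minimal illustration with made-up data; the field names, dates, and amounts are hypothetical, not from the posting:

```python
from datetime import date

# Hypothetical purchase history: customer -> list of (purchase_date, amount)
history = {
    "C1": [(date(2024, 6, 1), 120.0), (date(2024, 6, 20), 80.0)],
    "C2": [(date(2023, 11, 5), 40.0)],
}
today = date(2024, 7, 1)  # assumed analysis date

def rfm(purchases):
    """Compute the raw RFM values for one customer's purchase list."""
    recency = (today - max(d for d, _ in purchases)).days  # days since last purchase
    frequency = len(purchases)                             # number of purchases
    monetary = sum(a for _, a in purchases)                # total spend
    return recency, frequency, monetary

scores = {cust: rfm(p) for cust, p in history.items()}
print(scores)  # C1 is more recent, more frequent, and more valuable than C2
```

In practice these raw values are bucketed into quantile scores (e.g., 1-5 per dimension) over a Pandas DataFrame to form segments such as "champions" or "at risk"; the pure-Python version above just shows the underlying computation.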
Good-to-Have Skills:
- Experience with cloud platforms (Azure, AWS, GCP) and specific services like Azure ML, Synapse, Azure Kubernetes Service, and Databricks.
- Familiarity with big data processing tools like Apache Spark or Hadoop.
- Exposure to MLOps tools and practices (e.g., MLflow, Docker, Kubeflow) for model lifecycle management.
- Knowledge of deep learning libraries (TensorFlow, PyTorch) or experience with Generative AI (GenAI) and Large Language Models (LLMs).
- Proficiency with business intelligence and data visualization tools such as Tableau, Power BI, or Plotly.
- Experience working within Agile project delivery methodologies.
Competencies:
- Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
- Self-Development - Actively seeking new ways to grow and be challenged using both formal and informal development channels.
- Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
- Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
- Optimizes Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
About the Company
An online platform for creators, influencers, and celebrities to grow, manage, and monetise their communities. It is aimed at both established and aspiring online creators who want to pursue their passion. Our vision is to enable anyone and everyone to make a successful living doing what they enjoy.
Skills and Qualifications:
• Must have experience with a scripting language (JavaScript/NodeJS preferred)
• Experience with API-driven and highly scalable applications is a plus
• Good knowledge of non-relational DBs (MongoDB preferred)
• Good with data structures and algorithms
• Good to have experience with testing frameworks and CI/CD pipelines
• 1+ year of experience in software development, preferably with B2C experience
• Basic understanding of frontend and client-side frameworks like React, Angular, or Vue is a plus
• Experience in building scalable RESTful APIs and services
Intuitive Cloud (www.intuitive.cloud) is one of the fastest-growing top-tier cloud solutions and SDx engineering solutions and services companies, supporting 80+ global enterprise customers across the Americas, Europe, and the Middle East.
Intuitive is a recognized professional and managed services partner with core superpowers in cloud (public/hybrid), security, GRC, DevSecOps, SRE, application modernization/containers/K8s-as-a-service, and cloud application delivery.
- Experience managing, supporting, and deploying network infrastructures
- Strong ability to diagnose server or network alerts, events, or issues
- Understanding of common information architecture frameworks
- Perform proactive monitoring and alerting using a ticketing platform and reporting
- Ability to perform software upgrades, server and 3rd-party application patches, etc.
- Perform health checks and proactive maintenance of all infrastructure devices
- Good time management and organizational skills
- Ability to handle multiple concurrent tasks and projects with minimal supervision
- Knowledge of project management methodologies and techniques
- Good verbal as well as written communication skills
- Ability to work a flexible schedule with rotational 12-hour shifts (as per roster)
- Previous customer service or helpdesk experience preferred
The desired candidate is required to build and maintain scalable mobile applications.
- Should adhere to SDLC processes and Oracle standard documentation /Operating Procedures
- Develop Reports, Charts and Forms using built-in UI components
- Add HTML5 and CSS3 scripting to improve the User Interface thereby providing a better user experience
- Include custom scripting (JavaScript/JQuery/AJAX) as per the business requirement
- Prepare Test cases/reports; perform Unit, Peer, and System Testing on the same
- Inspect / Debug the code logic using the browser developer tools
- Deploy Authentication and Authorization for the modules as per requirement
- Build features and applications with a mobile responsive design (Not mandatory)
- Migrate application and DB objects from DEV to TEST/PROD environments.
- Mandatory to work from Bosch Bangalore office from the start
Mandatory and Optional Skills

Sr. No. | Technical Skill | Mandatory / Good to have | Min. years of hands-on experience
1 | Oracle APEX | Mandatory | 3 yrs
2 | Oracle SQL, PL/SQL, Oracle DB models/objects | Mandatory | 3-5 yrs
3 | Extensive project experience in developing Interactive Reports/Grids, Charts/Graphs, Forms (version > 20.x) | Mandatory | 3 yrs
4 | JavaScript, jQuery, AJAX, CSS, and HTML5 | Knowledge | 1-2 yrs
5 | Data and UI integration with external (Oracle & non-Oracle) systems | Knowledge | 3 yrs
Greetings! We are looking for a Product Manager for our data modernization product. We need a resource with good knowledge of Big Data/DWH, along with strong stakeholder management and presentation skills.
BRIEF DESCRIPTION:
At least 1 year of Python, Spark, SQL, and data engineering experience
Primary skillset: PySpark, Scala/Python/Spark, Azure Synapse, S3, Redshift/Snowflake
Relevant experience: legacy ETL job migration to AWS Glue using a Python & Spark combination
ROLE SCOPE:
Reverse engineer the existing/legacy ETL jobs
Create the workflow diagrams and review the logic diagrams with Tech Leads
Write equivalent logic in Python & Spark
Unit test the Glue jobs and certify the data loads before passing to system testing
Follow best practices and enable appropriate audit & control mechanisms
Analytically skilled; identify root causes quickly and debug issues efficiently
Take ownership of the deliverables and support the deployments
REQUIREMENTS:
Create data pipelines for data integration into cloud stacks, e.g., Azure Synapse
Code data processing jobs in Azure Synapse Analytics, Python, and Spark
Experience in dealing with structured, semi-structured, and unstructured data in batch and real-time environments
Should be able to process .json, .parquet, and .avro files
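The migration workflow above (reverse engineer a legacy job, rewrite the logic in Python, then unit test before system testing) is easiest when the transform is isolated as a small pure function. A minimal sketch; the record fields, filter rule, and derived column are hypothetical, not from the posting:

```python
import json

def transform(records):
    """Hypothetical ETL step: keep active rows, normalise names, derive a total."""
    out = []
    for r in records:
        if r.get("status") != "active":
            continue  # legacy job dropped non-active rows
        out.append({
            "id": r["id"],
            "name": r["name"].strip().title(),            # normalise whitespace/case
            "total": round(r["qty"] * r["unit_price"], 2),  # derived measure
        })
    return out

# Sample input mimicking a .json source file
raw = json.loads(
    '[{"id": 1, "name": " ada lovelace ", "status": "active", "qty": 3, "unit_price": 9.5},'
    ' {"id": 2, "name": "grace", "status": "inactive", "qty": 1, "unit_price": 4.0}]'
)
rows = transform(raw)
print(rows)  # only the active row survives, with a cleaned name and derived total
```

In an actual Glue job the same logic would be expressed over Spark DataFrames (or a DynamicFrame mapped through a function like this); keeping it as a pure function makes the unit-test-and-certify step straightforward.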
PREFERRED BACKGROUND:
Tier 1/2 candidates from IITs/NITs/IIITs preferred
However, relevant experience and a learning attitude take precedence
Designation: Linux/System/Support Engineer (L2) | Experience: 2-5 yrs
Notice period: immediate to 30 days
- Server monitoring
- Deployments
- Collecting information about reported issues
- Ensuring that all information has been logged in the ticketing system
- Must be able to follow and execute instructions specified in user guides and emails to run, monitor, and troubleshoot
- Must be able and willing to document activities and procedures
- Must have troubleshooting skills and knowledge of antivirus, firewall, and gateway software
- Should be ready to work extended shifts, if required
- Good customer management skills bundled with good communication skills
- Databases: concepts and ability to use DB tools such as psql
- Good understanding of Oracle, WebLogic, and Linux/Unix terminology, and able to execute commands
- Internet technologies: Tomcat/Apache concepts, basic HTML, etc.
- Able to use MS Excel and PowerPoint

- Resource should have at least 3+ years of experience and have worked on NEFT & RTGS (especially SFMS v6.1)
- Resource should know troubleshooting of finding messages for NEFT, RTGS, LC & BG
- Application start and stop knowledge
- Basic database knowledge, i.e., login and executing commands, including Data Pump
- RTGS message types
- NEFT message types
- NEFT & RTGS flow
- F series messages
- SFMS login, report generation, message finding, etc.
- Basic MQ knowledge
- Certificate request generation for RTGS & NEFT
- Requirements of NEFT & RTGS
- Installation of SFMS & its patches, including configuring client settings
- Resource should be well aware of NEFT & RTGS systems, i.e., SFMS
