11+ Scientific Computing Jobs in India
As a Lead Solutions Architect at Aganitha, you will:
* Engage and co-innovate with customers in BioPharma R&D
* Design and oversee implementation of solutions for BioPharma R&D
* Manage Engineering teams using Agile methodologies
* Enhance reuse with platforms, frameworks and libraries
Applying candidates must have demonstrated expertise in the following areas:
1. App dev with modern tech stacks of Python, ReactJS, and fit-for-purpose database technologies
2. Big data engineering with distributed computing frameworks
3. Data modeling in scientific domains, preferably in one or more of: Genomics, Proteomics, Antibody engineering, Biological/Chemical synthesis and formulation, Clinical trials management
4. Cloud and DevOps automation
5. Machine learning and AI (Deep learning)
Assistant Sales Manager – Interior Design & Home Décor
Job Summary:
We are seeking a motivated and detail-oriented Assistant Sales Manager to support the sales team in driving business growth within the interior design and home décor industry. The role involves assisting in sales operations, client relationship management, and business development activities, while ensuring seamless coordination with internal teams to deliver premium customer experiences.
Key Responsibilities:
· Assist the Sales Manager in executing sales strategies to achieve revenue targets.
· Generate leads through networking, referrals, and market research.
· Build and maintain strong relationships with clients, architects, designers, and builders.
· Support in preparing proposals, quotations, and presentations tailored to client requirements.
· Coordinate with design, project, and operations teams to ensure timely execution of client orders.
· Handle client queries, follow-ups, and ensure high levels of customer satisfaction.
· Maintain accurate sales records, reports, and CRM data.
· Participate in exhibitions, trade fairs, and promotional activities to represent the company.
· Monitor market trends and competitor activities to provide inputs for business growth.
· Motivate and guide junior sales executives to achieve their targets.
Key Requirements:
· Bachelor’s degree in Marketing, Business Administration, or a related field.
· 3–5 years of experience in sales/business development, preferably in the interior design, home décor, furniture, or real estate sector.
· Strong communication, presentation, and interpersonal skills.
· Ability to build long-term relationships with clients and industry partners.
· Good negotiation skills with a client-centric approach.
· Proficiency in MS Office and CRM software.
· Ability to work independently as well as part of a team.
Key Competencies:
· Client relationship management
· Sales and negotiation skills
· Market awareness and research ability
· Team support and coordination
· Result orientation with attention to detail
- Big Data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components for ingestion, data modeling, querying, processing, storage, analysis and data integration, and in implementing enterprise-level systems spanning Big Data.
- A skilled developer with strong problem-solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include the NameNode, DataNode, ResourceManager, NodeManager and Job History Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, configuration and management of Big Data and the underlying infrastructure of a Hadoop cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for ingestion and aggregation events, loading consumer response data into Hive external tables in an HDFS location to serve as a feed for Tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS Amazon cloud.
- Extensively worked on Spark using Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala; also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost the entire AWS stack (including EC2 and S3), focusing on high availability, fault tolerance and auto-scaling.
- Good experience with use-case development and with software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and the Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle and Netezza into HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with the Hadoop cluster.
- Hands-on experience in Hadoop Big Data technology, working with MapReduce, Pig and Hive as analysis tools, and Sqoop and Flume as data import/export tools.
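The MapReduce programming mentioned above boils down to a map phase emitting key-value pairs, a shuffle that groups pairs by key, and a reduce phase that aggregates each group. A minimal plain-Python sketch of that pattern, using the classic word-count example (function names and sample data here are illustrative, not tied to any specific framework):

```python
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word in every line.
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Shuffle/sort: group all emitted pairs by key, as the framework would
    # do between the map and reduce phases.
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield key, [value for _, value in group]

def reduce_phase(grouped):
    # Reducer: sum the counts for each word.
    return {key: sum(values) for key, values in grouped}

lines = ["big data big pipelines", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'pipelines': 2}
```

In Hadoop or Spark the same three steps run distributed across a cluster, with the shuffle moving data between nodes; the logic per record is the same.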
What will you do?
- Oversee testing, deployment & maintenance of the Voice infrastructure.
- Performing a variety of tasks associated with user provisioning and call routing within the assigned platform
- Build loaders for internal systems ensuring tasks are completed accurately and in a timely manner
- Participate in meetings with clients and internal departments to understand the requirements of call routing
- Document and design call flow diagrams to ensure that all call variables have been accounted for in preparing a routing plan
- Support the client's solution by correcting any issues with call routing or provisioning during the initial implementation phase, constantly communicating the status of any issues to the client
- Complete testing to ensure that call flow is operating properly
- Monitor and ensure compliance with standards, policies, and procedures
What are we looking for?
- 0.6–2 years' implementation experience in Asterisk PBX, IP telephony, and SIP.
- Able to thrive in a demanding and team-oriented environment
- Strong working knowledge of the Linux operating system.
- Knowledge of Bash and Perl scripting is an added advantage.
- RHCE Certification is an added advantage.
Job Description:
We are seeking a skilled Embedded Firmware Engineer with expertise in RTOS, embedded C/C++, Python, and IoT protocols. In this role, you will be responsible for developing and optimizing embedded firmware for 16-bit and 32-bit microcontrollers, focusing on board bring-up, testing, and debugging.
Key Responsibilities:
- Develop embedded firmware using embedded C/C++, Python, and data structures.
- Utilize RTOS, preferably Zephyr or FreeRTOS, for real-time embedded applications.
- Implement low-level embedded software design and development for microcontroller-based systems.
- Configure and integrate communication interfaces such as I2C, SPI, RS232/485, USB.
- Incorporate industrial protocols like Ethernet, Modbus, and REST into firmware designs.
- Apply hands-on experience with MQTT, HTTP, BLE, Wi-Fi, and web server technologies.
- Collaborate using GitHub and JIRA, following Agile/SAFe methodologies.
Required Skills:
- Proficient in embedded C/C++ programming and Python scripting.
- Strong understanding of RTOS, preferably with experience in Zephyr or FreeRTOS.
- Demonstrated knowledge of embedded firmware development for 16-bit / 32-bit microcontrollers (STM32, ESP32).
- Experience with communication interfaces including I2C, SPI, RS232/485, and USB.
- Familiarity with industrial protocols such as Ethernet, Modbus, and REST.
- Hands-on experience with IoT protocols including MQTT, HTTP, BLE, and Wi-Fi.
- Proficient use of GitHub and JIRA for version control and project management.
- Strong problem-solving skills and ability to troubleshoot embedded systems issues.
Qualifications:
- Bachelor's degree in Electrical Engineering, Computer Science, or related field.
- Proven experience in embedded firmware development and board bring-up.
- Excellent communication skills and ability to work in a collaborative team environment.
Join us in developing cutting-edge embedded firmware solutions for IoT applications and microcontroller-based systems!
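As a concrete illustration of the industrial-protocol work this role involves, here is a minimal sketch of framing a Modbus RTU "Read Holding Registers" request in Python. The slave address, register range, and function names are illustrative assumptions; real firmware would do this in C on the target, but the framing and CRC logic are the same:

```python
import struct

def crc16_modbus(data: bytes) -> int:
    # Standard Modbus CRC-16: init 0xFFFF, reflected polynomial 0xA001.
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 1:
                crc = (crc >> 1) ^ 0xA001
            else:
                crc >>= 1
    return crc

def build_read_holding_registers(slave: int, start: int, count: int) -> bytes:
    # Function code 0x03 = Read Holding Registers. Address/quantity fields
    # are big-endian, but the trailing CRC is transmitted low byte first.
    pdu = struct.pack(">BBHH", slave, 0x03, start, count)
    return pdu + struct.pack("<H", crc16_modbus(pdu))

# Illustrative request: read 2 holding registers from slave 1, address 0.
frame = build_read_holding_registers(slave=1, start=0x0000, count=2)
print(frame.hex())
```

A handy property for debugging: running the CRC over a complete frame (payload plus its appended CRC) yields zero, which is how a receiver can validate an incoming Modbus RTU message.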
Job Responsibilities
- Design, build & test ETL processes using Python & SQL for the corporate data warehouse
- Inform, influence, support, and execute our product decisions
- Maintain advertising data integrity by working closely with R&D to organize and store data in a format that provides accurate data and allows the business to quickly identify issues.
- Evaluate and prototype new technologies in the area of data processing
- Think quickly, communicate clearly and work collaboratively with product, data, engineering, QA and operations teams
- High energy level, strong team player and good work ethic
- Data analysis, understanding of business requirements and translation into logical pipelines & processes
- Identification, analysis & resolution of production & development bugs
- Support the release process including completing & reviewing documentation
- Configure data mappings & transformations to orchestrate data integration & validation
- Provide subject matter expertise
- Document solutions, tools & processes
- Create & support test plans with hands-on testing
- Peer reviews of work developed by other data engineers within the team
- Establish good working relationships & communication channels with relevant departments
Skills and Qualifications we look for
- University degree 2.1 or higher (or equivalent) in a relevant subject. Master’s degree in any data subject will be a strong advantage.
- 4–6 years of experience with data engineering.
- Strong coding ability and software development experience in Python.
- Strong hands-on experience with SQL and Data Processing.
- Experience with Google Cloud Platform (Cloud Composer, Dataflow, Cloud Functions, BigQuery, Cloud Storage, Dataproc)
- Good working experience in any one of the ETL tools (Airflow would be preferable).
- Should possess strong analytical and problem solving skills.
- Good-to-have skills: Apache PySpark, CircleCI, Terraform
- Motivated, self-directed, able to work with ambiguity and interested in emerging technologies, agile and collaborative processes.
- Understanding & experience of agile / scrum delivery methodology
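The core of the role above is the extract-transform-load pattern: pull raw records from a source, clean and aggregate them, and write them to the warehouse. A minimal self-contained sketch in Python, with SQLite standing in for the corporate warehouse (the advertising table and column names are illustrative assumptions):

```python
import sqlite3

def extract():
    # Extract: raw advertising events, e.g. as pulled from an upstream API.
    # Values arrive as strings, as they often do from raw feeds.
    return [
        {"campaign": "spring_sale", "clicks": "120", "cost": "35.50"},
        {"campaign": "spring_sale", "clicks": "80", "cost": "20.00"},
        {"campaign": "brand", "clicks": "40", "cost": "10.00"},
    ]

def transform(rows):
    # Transform: cast types and aggregate clicks and cost per campaign.
    totals = {}
    for row in rows:
        clicks, cost = int(row["clicks"]), float(row["cost"])
        c, s = totals.get(row["campaign"], (0, 0.0))
        totals[row["campaign"]] = (c + clicks, s + cost)
    return [(campaign, c, s) for campaign, (c, s) in totals.items()]

def load(conn, records):
    # Load: idempotent upsert into the warehouse table, so reruns of the
    # pipeline do not duplicate rows.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS campaign_stats "
        "(campaign TEXT PRIMARY KEY, clicks INTEGER, cost REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO campaign_stats VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
print(conn.execute("SELECT * FROM campaign_stats ORDER BY campaign").fetchall())
```

In production the same three steps would be orchestrated as tasks in Airflow or Cloud Composer, with BigQuery in place of SQLite; making the load step idempotent is what allows safe retries.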
- Gathering and evaluating user requirements in collaboration with product owners and engineers.
- Ensuring design consistency with the client’s development standards and guidelines.
- Ensuring high performance on mobile and desktop and fluency in mobile-based information architecture and design.
- Creating visually appealing experiences that feature user-friendly design and clear navigation.
- Guiding and maintaining developer teams and best practices.
What you need to have:
- Hands-on experience in building Web User Interfaces (UI) using HTML/HTML5, CSS, SCSS, advanced JavaScript libraries, jQuery, Bootstrap and UI frameworks such as React with Redux, Angular.js, Backbone, and Node.js with Hapi.js.
- Experience in creating ‘responsive’ and ‘adaptive’ websites using HTML5 and CSS3 targeting desktop, tablet, and mobile devices.
- Proficiency with Angular 2/4/5/6/7/8
- For the UI developer role, a good understanding of AJAX and JavaScript DOM manipulation techniques is a must
- Strong expertise with HTML, CSS, and writing cross-browser compatible code
- Experience in JavaScript build tools like Grunt or Gulp
- Expert in at least one modern JavaScript MVVM/MVC framework (AngularJS, jQuery, NodeJS, GruntJS, ReactJS)
- Strong understanding of front-end coding and development technologies.
- For the UI developer role, the ability to provide SEO solutions for websites is a must
- Experience with building the infrastructure for serving the front-end app and assets.
- Sound knowledge of IT concepts and the latest trends.
- Strong verbal and written communication and interpersonal skills
We are looking for a highly capable Node.js developer to optimize our web-based application performance. You will be collaborating with our front-end application developers, designing APIs, and integrating data storage and protection solutions.
Requirements
- Minimum of two years' previous experience as a Node.js developer.
- Minimum of two years of experience developing applications using MongoDB.
- Minimum of one year of experience developing applications using Express.js.
- Extensive knowledge of JavaScript, web stacks, libraries, and frameworks.
- Exceptional analytical and problem-solving aptitude.
- Great organizational and time management skills.
- Availability to resolve urgent web application issues outside of business hours.
Responsibility
- Ensuring optimal performance of the central database and responsiveness to front-end requests.
- Collaborating with front-end developers on the integration of the APIs.
- Developing high-performance applications by writing testable, reusable, and efficient code.
- Implementing effective security protocols, data protection measures, and storage solutions.
- Running diagnostic tests and identifying bugs.
- Documenting Node.js processes, including database schemas.
- Recommending and implementing improvements to APIs.
Role: Blockchain Developer with .Net
Roles and Responsibilities:
- Must be proficient in developing object-oriented applications
- Must have hands-on experience with C# & WebAPI.
- Participate in troubleshooting and training of issues with different teams to drive towards root cause identification and resolution.
- Participate in technical communication within the team and to other groups, prepare technical design documents.
- Must have basic hands-on experience with the Ethereum blockchain and Solidity
- Good to have exposure to Azure.
- Positive attitude, smart work, a problem-solving mindset, and readiness to upgrade skills with the latest technologies
- Developing technical specifications.
- Drafting software and application operating procedures.
- Training junior staff.
- Good verbal and written communication skills
Required Blockchain(Ethereum) Experience: 3 to 6+ Months
Required C#, .Net Experience: 2+ year
Overall Experience: 2 – 3 years
Industry: IT-Software / Software Services
Functional Area: IT Software – Development and Maintenance
Role Category: Programming & Design
Employment Type: Permanent Job, Full Time