11+ Theoretical physics Jobs in India
This position is based in San Sebastián, Spain.
Multiverse is a well-funded deep-tech company and one of the few companies working with quantum computing. We provide hyper-efficient software to companies wanting to gain an edge with quantum computing and artificial intelligence, across finance, energy, manufacturing, cybersecurity, and many more industries. You will work alongside world-leading experts to build solutions that tackle real-life issues. We are proud to grow in an ethics-driven environment, promoting sustainability and diversity, and we aim to continue building our truly inclusive culture - come and join us!
As a Quantum Computing Engineer, you will:
- Join a world-class team of Quantum experts with an extensive track record in both academia and industry.
- Work with Fortune-500 customers from government and private sectors.
- Collaborate with the founding team in a fast paced startup environment.
- Contribute to designing and improving Multiverse’s codebase.
- Design and implement new quantum algorithms.
Required Qualifications
- 3 years of experience in a similar position.
- Master's degree in Quantum Physics, Quantum Engineering, or a related field.
- Knowledge of quantum computing algorithms, particularly gate model and hybrid techniques.
- Advanced knowledge of Python and related scientific computing tools.
- Experience with quantum computing SDKs (Qiskit, Pennylane, or others).
- Ability to work with a diverse team of people, both in-person and through online resources.
- Perfect command of the English language.
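For readers unfamiliar with the gate model mentioned above, the core idea can be sketched in a few lines of plain Python - a toy single-qubit statevector simulator, not how SDKs like Qiskit or Pennylane are implemented internally:

```python
import math

# Toy gate-model sketch: a qubit state |0> as an amplitude vector [amp_0, amp_1].
state = [complex(1, 0), complex(0, 0)]

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by the statevector."""
    return [
        gate[0][0] * state[0] + gate[0][1] * state[1],
        gate[1][0] * state[0] + gate[1][1] * state[1],
    ]

# Hadamard gate: sends |0> to an equal superposition of |0> and |1>.
h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]

state = apply_gate(H, state)
probs = [abs(a) ** 2 for a in state]  # Born rule: measurement probabilities
```

Measuring after the Hadamard gives each outcome with probability 1/2, which is the equal superposition that gate-model algorithms build on.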
Preferred Qualifications
- PhD in Quantum Physics, Computer Science, or related field.
- Industry experience.
- Experience using git, unit testing, and docker.
Exceptional Qualifications
- Experience with error mitigation techniques and tools.
- Ability to write patent applications.
- Ability to write grant proposals.
Perks & Benefits
● Potential to grow into long-term opportunities.
● Working in a fast-paced environment on cutting-edge technologies.
● Opportunity to learn and teach.
● A mix of remote and office work – A minimum of 3 days office work is required.
● Relocation package (minimum distance applies).
● Private health insurance.
● Progressive Company. Happy people culture.
● Competitive salary.
- 5+ years of experience designing, developing, validating, and automating ETL processes
- 3+ years of experience with traditional ETL tools such as Visual Studio, SQL Server Management Studio, SSIS, SSAS, and SSRS
- 2+ years of experience with cloud technologies and platforms such as Kubernetes, Spark, Kafka, Azure Data Factory, Snowflake, MLflow, Databricks, Airflow, or similar
- Must have experience designing and implementing data access layers
- Must be an expert in SQL/T-SQL and Python
- Must have experience with Kafka
- Define and implement data models with database technologies such as MongoDB, CosmosDB, Neo4j, MariaDB, and SQL Server
- Ingest and publish data from sources and to destinations via an API
- Exposure to ETL/ELT using Kafka or Azure Event Hubs with Spark or Databricks is a plus
- Exposure to healthcare technologies and integrations such as the FHIR API, HL7, or other HIE protocols is a plus
Skills Required :
Designing, Developing, ETL, Visual Studio, Python, Spark, Kubernetes, Kafka, Azure Data Factory, SQL Server, Airflow, Databricks, T-SQL, MongoDB, CosmosDB, Snowflake, SSIS, SSAS, SSRS, FHIR API, HL7, HIE Protocols
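The extract-transform-load pattern at the heart of this role can be sketched with the standard library, using sqlite3 in place of SQL Server; the table, columns, and sample rows below are invented for illustration:

```python
import sqlite3

# Extract: raw source rows (in practice from files, APIs, or a Kafka topic).
raw_rows = [("alice", "42"), ("bob", "17"), ("carol", "notanumber")]

def transform(rows):
    """Validate and convert raw rows, dropping records that fail validation."""
    clean = []
    for name, value in rows:
        try:
            clean.append((name.title(), int(value)))
        except ValueError:
            continue  # in production, route bad records to a rejects table
    return clean

# Load: write validated rows into the destination table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE scores (name TEXT, score INTEGER)")
conn.executemany("INSERT INTO scores VALUES (?, ?)", transform(raw_rows))

loaded = conn.execute("SELECT name, score FROM scores ORDER BY name").fetchall()
```

Tools like SSIS or Azure Data Factory orchestrate the same three stages at scale, with the transform step handling schema mapping, validation, and rejection routing.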
- Bachelor’s degree, preferably in Engineering, or equivalent professional or military experience, with 10-15 years of experience.
- 5+ years of large-scale software development or application engineering, with recent coding experience in two or more modern programming languages such as Java, JavaScript, C/C++, C#, Swift, Node.js, Python, Go, or Ruby
- Experience with Continuous Integration and Continuous Delivery (CI/CD)
- Helping customers architect scalable, highly available application solutions that leverage at least two of AWS, GCP, and Azure.
- Architecting and developing customer applications to be cloud-native, re-engineered, or optimized
- Working as a technical leader alongside customer business and development teams, with support to the infrastructure team
- Providing deep software development knowledge with respect to cloud architecture, design patterns, and programming
- Advising on and implementing Cloud (AWS/GCP/Azure) best practices
- Working as both an application architect and a development specialist on cloud-native apps, from architecture and development through deployment.
- Implementing DevOps practices such as infrastructure as code, continuous integration and automated deployment
Job Description:
As a developer in the chatbot team you’ll be responsible for expanding the scope of AERIN (Aertrip’s Chatbot) to handle all types of travel queries in a scalable and efficient manner. You will need to make AERIN as robust as possible and strive towards 100% accuracy. You should be able to design, develop, test and roll out new features quickly. You should be ready to tackle open-ended problems and build solutions for them.
Preferred qualifications & experience:
- Bachelor’s degree in Computer Science or equivalent experience.
- 1 or more years of work experience in the tech industry.
- Experience with NLP solutions.
- Experience with chatbot frameworks such as RASA or Dialogflow is a bonus.
- Experience with spaCy, NLTK, and other ML libraries.
- Familiarity with version control tools like Gitlab.
- Experience in object oriented design and programming.
- Sound programming skills to write algorithms for text analysis and context understanding.
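A taste of the text-analysis algorithms this role calls for: a minimal bag-of-words intent matcher in plain Python. This is an illustrative sketch, not AERIN's actual design - production chatbots would use RASA, Dialogflow, or spaCy pipelines, and the intents below are invented:

```python
import math
from collections import Counter

# Hypothetical travel intents, each with a short example utterance.
INTENTS = {
    "book_flight": "book a flight ticket from city to city",
    "cancel_booking": "cancel my booking refund ticket",
    "flight_status": "status of my flight is it delayed",
}

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def classify(query):
    """Return the intent whose example text is most similar to the query."""
    q = Counter(query.lower().split())
    return max(INTENTS, key=lambda i: cosine(q, Counter(INTENTS[i].split())))
```

Real systems replace raw word counts with learned embeddings, but the match-query-against-intent structure is the same.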
Preferred Skills & Abilities:
- Should be able to write clean and manageable code.
- Passion for innovation.
- Out of the box thinking.
- Should be able to understand both the tech and business aspects of a system.
- Should be able to meet deadlines.
- Should be able to work in a collaborative environment.
- Should be able to work independently in small teams.
- 3 to 4 years of professional experience as a DevOps / System Engineer
- Command line experience with Linux including writing bash scripts
- Programming in Python, Java or similar
- Fluent in Python and Python testing best practices
- Extensive experience working within AWS and with its managed products (EC2, ECS, ECR, Route 53, SES, ElastiCache, RDS, VPCs, etc.)
- Strong experience with containers (Docker, Compose, ECS)
- Version control system experience (e.g. Git)
- Networking fundamentals
- Ability to learn and apply new technologies through self-learning
Responsibilities
- As part of a team implement DevOps infrastructure projects
- Design and implement secure automation solutions for development, testing, and production environments
- Build and deploy automation, monitoring, and analysis solutions
- Manage our continuous integration and delivery pipeline to maximize efficiency
- Implement industry best practices for system hardening and configuration management
- Secure, scale, and manage Linux virtual environments
- Develop and maintain solutions for operational administration, system/data backup, disaster recovery, and security/performance monitoring
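The system/data backup duty above can be sketched with the standard library alone; the timestamped-archive naming scheme is an illustrative assumption, not a prescribed convention:

```python
import os
import tarfile
import tempfile
from datetime import datetime, timezone

def backup_dir(src_dir, dest_dir):
    """Create a timestamped .tar.gz archive of src_dir inside dest_dir."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    archive = os.path.join(dest_dir, f"backup-{stamp}.tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(src_dir, arcname=os.path.basename(src_dir))
    return archive

# Demo against a throwaway directory.
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "data")
    os.makedirs(src)
    with open(os.path.join(src, "state.txt"), "w") as f:
        f.write("important")
    path = backup_dir(src, tmp)
    with tarfile.open(path) as tar:
        names = tar.getnames()
```

In practice this would be scheduled (cron, systemd timers), shipped off-host, and paired with a retention and restore-test policy.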
Velankani Communications Technology
Job Description
Immediate joiner/ 30 days Notice Period
Location: Bangalore
Academics:
Primary Skills:
- Candidate should have 4-5 years of development experience, with hands-on Linux kernel programming and device driver development in products such as SD-WAN, ADS/ADC, advanced firewall, WAN optimization, etc.
Experience:
- Experienced in System Programming and developing multi-threaded high performance networking products.
- Strong in Linux kernel programming and device driver development.
- Knowledge of DPDK, VLAN, Trunking, WiFi 802.11 standards, and LTE.
Preferred:
- Experience in hypervisors and cloud platforms
- Strong Knowledge in the networking domain (L2/L3/L4)
Languages:
- C, C++
- Python(Good to have)
Primary Skills:
C, C++, Linux kernel programming, Device Driver, VLAN, DPDK.
Secondary Skill: Data Structures, Cloud, Python, Networking.
at Synapsica Technologies Pvt Ltd
Introduction
Synapsica (http://www.synapsica.com/) is a series-A funded HealthTech startup founded by alumni from IIT Kharagpur, AIIMS New Delhi, and IIM Ahmedabad. We believe healthcare needs to be transparent and objective while being affordable. Every patient has the right to know exactly what is happening in their body; they shouldn't have to rely on cryptic two-liners given to them as a diagnosis.
Towards this aim, we are building an artificial-intelligence-enabled, cloud-based platform to analyse medical images and create v2.0 of advanced radiology reporting. We are backed by IvyCap, Endiya Partners, Y Combinator, and other investors from India, the US, and Japan. We are proud to have GE and The Spinal Kinetics as our partners. Here’s a small sample of what we’re building: https://www.youtube.com/watch?v=FR6a94Tqqls
Your Roles and Responsibilities
Synapsica is looking for a Principal AI Researcher to lead and drive AI based research and development efforts. Ideal candidate should have extensive experience in Computer Vision and AI Research, either through studies or industrial R&D projects and should be excited to work on advanced exploratory research and development projects in computer vision and machine learning to create the next generation of advanced radiology solutions.
The role involves computer vision tasks including development, customization, and training of Convolutional Neural Networks (CNNs); application of ML techniques (SVM, regression, clustering, etc.); and traditional image processing (OpenCV, etc.). The role is research-focused and involves reading and implementing existing research papers, deep problem analysis, frequent review of results, generating new ideas, building new models from scratch, publishing papers, and automating and optimizing key processes. The role spans from real-world data handling to the most advanced methods such as transfer learning, generative models, and reinforcement learning, with a focus on understanding quickly and experimenting even faster. The successful candidate will collaborate closely with the medical research team, software developers, and AI research scientists. The candidate must be creative, ask questions, and be comfortable challenging the status quo. The position is based in our Bangalore office.
Primary Responsibilities
- Interface between product managers and engineers to design, build, and deliver AI models and capabilities for our spine products.
- Formulate and design AI capabilities of our stack with special focus on computer vision.
- Strategize end-to-end model training flow including data annotation, model experiments, model optimizations, model deployment and relevant automations
- Lead teams, engineers, and scientists to envision and build new research capabilities and ensure delivery of our product roadmap.
- Organize regular reviews and discussions.
- Keep the team up-to-date with latest industrial and research updates.
- Publish research and clinical validation papers
Requirements
- 6+ years of relevant experience in solving complex real-world problems at scale using computer vision-based deep learning.
- Prior experience in leading and managing a team.
- Strong problem-solving ability
- Prior experience with Python, cuDNN, Tensorflow, PyTorch, Keras, Caffe (or similar Deep Learning frameworks).
- Extensive understanding of computer vision/image processing applications such as object classification, segmentation, object detection, etc.
- Ability to write custom Convolutional Neural Network architectures in PyTorch (or similar)
- Background in publishing research papers and/or patents
- Computer Vision and AI Research background in medical domain will be a plus
- Experience with GPU/DSP/other multi-core architecture programming
- Effective communication with other project members and project stakeholders
- Detail-oriented, eager to learn, acquire new skills
- Prior Project Management and Team Leadership experience
- Ability to plan work and meet deadlines
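The building block behind the CNN work described above is the 2-D convolution, shown here in plain Python rather than PyTorch (valid padding, single channel, stride 1; frameworks actually compute cross-correlation, as done here):

```python
def conv2d(image, kernel):
    """2-D 'valid' convolution over nested lists (cross-correlation convention)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            # Dot product of the kernel with the image patch at (r, c).
            row.append(sum(
                image[r + i][c + j] * kernel[i][j]
                for i in range(kh) for j in range(kw)
            ))
        out.append(row)
    return out

# A vertical-edge kernel on a tiny image whose right half is bright:
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[-1, 1]] * 3  # responds to left-to-right intensity increases
edges = conv2d(image, kernel)
```

The kernel fires only at the column where intensity jumps - the same feature-detection mechanism a trained CNN layer learns, at scale and on GPU.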
A robotics startup building for logistics and manufacturing.
- Design & Development of the architecture for multi-robot planning.
- Design & development of task allocation algorithms
- Design & development of conflict resolution approaches.
- Design & develop queuing strategies for multi-robot deployments.
- Research & collaboration on approaches to improve task allocation based on historical data
- Design & development of communication architecture for inter-robot and robot-server communications.
Requirements
- B.Tech, M.Tech, or higher qualification in Computer Science Engineering, Information Technology, or related fields
- Proficiency with the C++/Python programming languages
- 3-5 years of experience working in the robotics field
- Skilled at general software development, bug analysis, and fixing
- Knowledge of networking/communication concepts
- Strong knowledge of Robot Operating System (ROS)
- Good knowledge of system design
- Excellent problem-solving skills
- Good project management skills
- Excellent verbal communication skills
- Good interpersonal skills
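The task allocation problem named above can be sketched with a greedy nearest-robot heuristic in plain Python. This is only an illustration - the robot names, positions, and costs are invented, and real deployments would use auction-based or optimal-assignment algorithms (e.g. the Hungarian method) instead of a greedy pass:

```python
import math

def greedy_allocate(robots, tasks):
    """Assign each task to the nearest still-free robot (greedy, not optimal)."""
    free = dict(robots)  # robot name -> (x, y) position
    assignment = {}
    for task, pos in tasks.items():
        if not free:
            break  # more tasks than robots; leftovers wait for the next round
        best = min(free, key=lambda r: math.dist(free[r], pos))
        assignment[task] = best
        del free[best]
    return assignment

robots = {"r1": (0, 0), "r2": (10, 0)}
tasks = {"pick_A": (1, 1), "pick_B": (9, 1)}
alloc = greedy_allocate(robots, tasks)
```

Conflict resolution and queuing then layer on top: two robots assigned to nearby tasks must negotiate path priority, which is where the communication architecture comes in.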
We are looking for an automation specialist who will play a key role in Sattva's digitisation initiatives. Our rapid growth in the last year has underscored the importance of technology-driven solutions to manage business processes at scale.
Currently our tech landscape is a collection of best-of-breed SaaS solutions that need to be integrated/extended based on business needs. This role involves identifying automation opportunities and realising them through low/no-code platforms like AppSheet, Zapier, etc. It is a technical role that also involves interfacing with people across different Business Units within Sattva. It offers the opportunity to work with best-in-class SaaS solutions like Google Workspace, FreshTeams, ClickUp, and QuickBooks.
Responsibilities
● Analyse the existing landscape of SaaS solutions to identify automation gaps in key business processes
● Integrate best-of-breed SaaS solutions using APIs and low/no-code tools
● Build apps to extend existing SaaS solutions like FreshTeams, QuickBooks, ClickUp, etc. using available APIs and SDKs
● Configure SaaS solutions to meet the needs of a specific Business Unit or of a defined security policy
● Build Slack apps to integrate with SaaS solutions in the landscape
● Troubleshoot technical issues with the configured solutions in the landscape
Ideal Candidate Profile
● 1+ years of experience in integrating/extending SaaS solutions
● Solid expertise in developing automation scripts and applications using JavaScript or Python
● Strong problem-solving ability
● Excellent communication skills
● Proven ability to interface with multiple stakeholders across business verticals
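Much of the integrate/extend work above reduces to mapping records between SaaS schemas. A hedged sketch - both the HR-side and task-tracker-side field names below are invented for illustration and are not actual FreshTeams or ClickUp payloads:

```python
def hr_to_tracker(employee):
    """Map a hypothetical HR record to a hypothetical task-tracker user payload."""
    return {
        "username": f"{employee['first_name']}.{employee['last_name']}".lower(),
        "email": employee["official_email"],
        # Role mapping: managers get elevated access in the target system.
        "role": "admin" if employee.get("is_manager") else "member",
    }

record = {
    "first_name": "Asha",
    "last_name": "Rao",
    "official_email": "asha@example.com",
    "is_manager": False,
}
payload = hr_to_tracker(record)
```

Low/no-code platforms like Zapier express the same mapping visually; writing it as a pure function makes it testable and reusable across integrations.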
Your mission is to help lead the team towards creating solutions that improve the way our business is run. Your knowledge of design, development, coding, testing, and application programming will help your team raise their game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex and mission-critical problems, internally and externally. Your quest to embrace leading-edge technologies and methodologies inspires your team to follow suit.
Responsibilities and Duties :
- As a Data Engineer you will be responsible for the development of data pipelines for numerous applications handling all kinds of data: structured, semi-structured, and unstructured. Big data knowledge, especially of Spark and Hive, is highly preferred.
- Work in a team and provide proactive technical oversight; advise development teams, fostering re-use, design for scale, stability, and operational efficiency of data/analytical solutions
Education level :
- Bachelor's degree in Computer Science or equivalent
Experience :
- Minimum 3+ years of relevant experience on production-grade projects, with hands-on, end-to-end software development experience
- Expertise in application, data and infrastructure architecture disciplines
- Expertise in designing data integrations using ETL and other data integration patterns
- Advanced knowledge of architecture, design and business processes
Proficiency in :
- Modern programming languages like Java, Python, Scala
- Big Data technologies: Hadoop, Spark, Hive, Kafka
- Writing well-optimized SQL queries
- Orchestration and deployment tools like Airflow & Jenkins for CI/CD (Optional)
- Responsible for design and development of integration solutions with Hadoop/HDFS, Real-Time Systems, Data Warehouses, and Analytics solutions
- Knowledge of system development lifecycle methodologies, such as waterfall and Agile.
- An understanding of data architecture and modeling practices and concepts, including entity-relationship diagrams, normalization, abstraction, denormalization, dimensional modeling, and metadata modeling practices.
- Experience generating physical data models and the associated DDL from logical data models.
- Experience developing data models for operational, transactional, and operational reporting, including the development of or interfacing with data analysis, data mapping, and data rationalization artifacts.
- Experience enforcing data modeling standards and procedures.
- Knowledge of web technologies, application programming languages, OLTP/OLAP technologies, data strategy disciplines, relational databases, data warehouse development and Big Data solutions.
- Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Skills :
Must Know :
- Core big-data concepts
- Spark - PySpark/Scala
- A data integration tool such as Pentaho, NiFi, or SSIS (at least one)
- Handling of various file formats
- Cloud platform - AWS/Azure/GCP
- Orchestration tool - Airflow
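The "core big-data concepts" listed first boil down to the map-shuffle-reduce pattern, which Spark's RDD operations distribute across a cluster. A word count in plain Python shows the three stages on a single machine:

```python
from collections import defaultdict

lines = ["spark makes big data simple", "big data needs big tools"]

# Map: emit (word, 1) pairs from each input line.
mapped = [(word, 1) for line in lines for word in line.split()]

# Shuffle: group values by key, as the framework would across partitions.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: sum the counts per key.
counts = {word: sum(vals) for word, vals in groups.items()}
```

In PySpark the same pipeline is `rdd.flatMap(str.split).map(lambda w: (w, 1)).reduceByKey(add)`; the shuffle stage is what makes the distributed version expensive and what partitioning strategy tunes.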