Responsibilities:
Maintain and manage the current IT infrastructure.
Maintain and manage the current computer network and servers.
Provide the right solution for user queries and requirements.
Handle day-to-day IT queries and issues within the given timelines.
Give a daily status report to the respective supervisor/manager.
Adhere to the expected roles and responsibilities with respect to ISMS and QMS standards.
Required Skills:
Network and server installation and configuration
Linux & Windows system management
Hardware and software installation and management
O365 and G Suite management
Client communication and follow-up
Vendor communication and follow-up
Educational Qualifications, Certifications & Experience: 3-5 years of experience
Knowledge of Windows Server, Hyper-V, networking (Cisco) technology, O365, and G Suite
Certifications: CCNA (preferred), MCSA (preferred), RHCE (preferred)
Graduation: any discipline (graduation in Computer Science or a related field is a big plus)

About Rekise Marine
Rekise Marine is a startup focused on sustainably enhancing the utility of oceans through autonomous robotic infrastructure. Our efforts center on developing advanced autonomous technology for the maritime industry, serving both the defense and commercial sectors globally. We specialize in creating autonomous vessels, both surface and underwater, as well as autonomous port infrastructure. Currently, we are building the flagship autonomous platform of the Indian Navy.
Key Responsibilities
* Develop AI/ML pipelines for sonar/LiDAR/Radar and camera-based perception.
* Design multi-sensor fusion frameworks for obstacle detection, seabed mapping, and environmental awareness.
* Implement real-time object detection, segmentation, and tracking for underwater missions.
* Enhance robustness of perception under low-light, turbidity, and noisy acoustic conditions.
* Apply model optimization techniques (quantization, pruning, distillation, real-time deployment tuning) to ensure efficiency on embedded and resource-constrained systems (a minimal sketch follows this list).
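As context for the optimization bullet above, here is a minimal, hypothetical PyTorch sketch of post-training dynamic quantization; the DetectionHead module, its layer sizes, and the class count are illustrative stand-ins, not part of this posting.

# Minimal sketch: post-training dynamic quantization in PyTorch.
# DetectionHead is a hypothetical stand-in for a perception model head.
import torch
import torch.nn as nn

class DetectionHead(nn.Module):
    def __init__(self, in_features: int = 256, num_classes: int = 8):
        super().__init__()
        self.fc1 = nn.Linear(in_features, 128)
        self.fc2 = nn.Linear(128, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc2(torch.relu(self.fc1(x)))

model = DetectionHead().eval()

# Replace Linear layers with int8 dynamically quantized versions,
# shrinking the model and speeding up CPU inference on embedded targets.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

scores = quantized(torch.randn(1, 256))  # same forward API as the fp32 model

Pruning, distillation, and static quantization follow the same broad pattern: trade a small accuracy loss for footprint and latency gains on resource-constrained targets.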
Preferred Skills
* Experience with deep learning frameworks (PyTorch/TensorFlow).
* Strong knowledge of signal processing, computer vision, and sensor fusion (see the fusion sketch after this list).
* Proficiency in GPU acceleration, C++/Python, ROS/ROS2.
* Track record of published research or field deployments in underwater perception.
* Demonstrable full stack experience with ML based perception (data collection, annotation, training & edge inference).
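As a toy illustration of the sensor-fusion item above, the sketch below combines two noisy range measurements with inverse-variance weighting, the scalar building block of a Kalman update; all sensor values and variances are invented for the example.

# Minimal sketch: fusing two noisy range readings of the same target
# (e.g., one sonar return and one LiDAR return) by inverse-variance weighting.
def fuse(z1: float, var1: float, z2: float, var2: float) -> tuple[float, float]:
    """Combine two independent estimates of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    return fused, 1.0 / (w1 + w2)  # fused estimate and its variance

# Illustrative numbers only: LiDAR is less noisy, so it dominates the result.
sonar_range, sonar_var = 12.4, 0.50
lidar_range, lidar_var = 12.1, 0.05
est, est_var = fuse(sonar_range, sonar_var, lidar_range, lidar_var)
print(f"fused range: {est:.2f} m (variance {est_var:.3f})")

The same weighting generalizes to the multivariate Kalman update used in full multi-sensor fusion stacks.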
Good to Have
* Publications in top-tier robotics/AI conferences or journals (e.g., ICRA, IROS, ICAR, CVPR, ICCV, NeurIPS).
* Hands-on experience with real-world Autonomous Systems (AGV/AUV/UAV), field trials, and deployments.
Why You’ll Love Working With Us
A chance to be part of a leading marine robotics startup in India.
Competitive salary.
Flexible and innovative work environment promoting collaboration.
A role where your contributions make a real difference and drive impact.
Opportunities for travel in relation to customer interactions and field testing.
● Relentlessly improve performance, scalability, and maintainability.
● Sound knowledge and application of algorithms and data structures.
● Proficient in Java, Spring Boot, and MySQL.
● Able to efficiently diagnose bugs and issues.
● Understanding of when to escalate questions/issues that arise during development.
● Work with other developers, QA, DevOps, and business staff to efficiently launch features and resolve issues.
● Actively participate in design and code reviews to build robust applications and prototypes.
● A willingness to dive deep, experiment rapidly and get things done.
● Provide input to how we can continually improve our development process and knowledge.
● Define and participate in establishing better engineering practices.
● Love being challenged by learning and experimenting with new technologies
● Experience with Elasticsearch, Cassandra, Redis, Kafka, and AWS is a plus.
● Prior experience building microservices and independently working on architectures and designs is a plus.
● Prior product-building experience or a startup background is a plus.
- Proficiency in shell scripting
- Proficiency in automation of tasks
- Proficiency in Pyspark/Python
- Proficiency in writing and understanding Sqoop jobs (a minimal sketch follows this list)
- Understanding of Cloudera Manager
- Good understanding of RDBMS
- Good understanding of Excel
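A minimal sketch tying the scripting, automation, and Sqoop items above together, assuming a hypothetical MySQL source and HDFS target (the JDBC URL, credentials, table, and paths are placeholders):

# Minimal sketch: wrapping a Sqoop import in Python for task automation.
import subprocess

sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://db.example.com/sales",  # placeholder JDBC URL
    "--username", "etl_user",                          # placeholder user
    "--password-file", "/user/etl/.password",          # placeholder path
    "--table", "orders",                               # placeholder table
    "--target-dir", "/data/ingest/orders",             # placeholder HDFS dir
    "--num-mappers", "4",                              # parallel import tasks
]
subprocess.run(sqoop_cmd, check=True)  # raises CalledProcessError on failure

The same pattern extends to scheduling: wrap the call in a cron job or an Oozie shell action to automate recurring imports.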
- Big data developer with 8+ years of professional IT experience and expertise in Hadoop ecosystem components across ingestion, data modeling, querying, processing, storage, analysis, and data integration, implementing enterprise-level big data systems.
- A skilled developer with strong problem-solving, debugging, and analytical capabilities who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components such as Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala, and Oozie.
- Hands-on experience in creating real-time data streaming solutions using Apache Spark Core, Spark SQL & DataFrames, Kafka, Spark Streaming, and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, including the NameNode, DataNode, ResourceManager, NodeManager, and Job History Server.
- Worked on both the Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in the installation, configuration, and management of big data clusters and the underlying Hadoop infrastructure.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala, and Python for analyzing big data.
- Exposure to the Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL/Oracle.
- Implemented Spark applications in Python, utilizing DataFrames and the Spark SQL API for faster data processing; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and loading the data into HDFS (see the PySpark sketch after this list).
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands-on experience with Spark MLlib, used for predictive intelligence, customer segmentation, and smooth maintenance in Spark Streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Created data pipelines for the ingestion and aggregation of different events, loading consumer response data into Hive external tables in HDFS to serve as a feed for Tableau dashboards.
- Hands-on experience in using Sqoop to import data into HDFS from RDBMS and vice versa.
- In-depth understanding of Oozie for scheduling all Hive/Sqoop/HBase jobs.
- Hands-on expertise in real-time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience with the Microsoft cloud and with setting up clusters on Amazon EC2 & S3, including automating the setup and extension of clusters in the AWS cloud.
- Extensively worked on Spark using Python on clusters for analytics; installed Spark on top of Hadoop and built advanced analytical applications using Spark with Hive and SQL.
- Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka, and Flume.
- Knowledge of the installation, configuration, support, and management of Hadoop clusters using the Apache and Cloudera (CDH3, CDH4) distributions and on Amazon Web Services (AWS).
- Experienced in writing ad hoc queries using Cloudera Impala, including Impala analytical functions.
- Experience in creating DataFrames using PySpark and performing operations on them using Python.
- In-depth understanding of Hadoop architecture and its various components, such as HDFS, the MapReduce programming paradigm, high availability, and the YARN architecture.
- Established connections to multiple Redshift clusters (Bank Prod, Card Prod, SBBDA) and provided access for pulling the information needed for analysis.
- Generated various knowledge reports using Power BI based on business specifications.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Experienced in projects using JIRA, testing, and the Maven and Jenkins build tools.
- Experienced in designing, building, deploying, and utilizing much of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development and software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience importing data using Sqoop and SFTP from various sources such as RDBMS, Teradata, Mainframes, Oracle, and Netezza into HDFS and performing transformations on it using Hive, Pig, and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra, and MongoDB, and their integration with Hadoop clusters.
- Hands-on experience with Hadoop big data technology, working with MapReduce, Pig, and Hive as analysis tools and Sqoop and Flume as data import/export tools.
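To make the Sqoop-to-Hive-to-Tableau pipeline described in the bullets above concrete, here is a minimal PySpark sketch; the application name, HDFS path, column names, and table name are hypothetical, not taken from this profile.

# Minimal sketch: aggregate Sqoop-landed event data and publish a Hive feed.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("consumer-response-feed")  # illustrative app name
         .enableHiveSupport()
         .getOrCreate())

# Assume Sqoop has already imported the RDBMS table to this HDFS path as Parquet.
events = spark.read.parquet("hdfs:///data/ingest/consumer_events")

# Aggregate responses per day and channel for the dashboard feed.
daily = (events
         .withColumn("event_date", F.to_date("event_ts"))
         .groupBy("event_date", "channel")
         .agg(F.count("*").alias("responses")))

# Write to a Hive table that the Tableau dashboards read from.
daily.write.mode("overwrite").saveAsTable("analytics.consumer_response_daily")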
We are hiring IN OFFICE ONLY.
Office located in the beautiful Astra Towers in Rajarhat Kolkata.
We are looking for Senior Laravel Developers with experience in the Laravel PHP framework and Vue.js.
You will work right away with other senior Laravel developers who know the products, learning the projects and helping to clean up and polish our tech. We are a product-based company.
A little about us:
Klizo Solutions has over 50 employees and is located in the New Town area, near CC2. Our employees range from graphic designers, back-end developers, iOS and Android Developers, and more. We are primarily a product-focused company. Our founder is from the USA and spends about 6 months a year here going back and forth. We develop technologies around his products and develop products for a few other clients. We are looking to expand to more locations, and are hiring for those reasons. You can see a company video here: https://klizos.com/about-us/
Preferred & Required knowledge and Duties:
- MVC Structure
- SQL concept
- Eloquent Relationships
- HTTP middleware
- Basic knowledge of jQuery
- Basic command of Laravel (how to create models, controllers, migrations, and middleware)
- Experience with Vue.js
- Experience with Git
- Experience integrating third-party APIs
- Experience with payment APIs
Required Shifts:
We work Monday through Friday.
We may rarely be needed on a Saturday if a project requires it.
BONUS Knowledge (not required):
- MongoDB
- Worked with Jira
- Knows Github / Repo / Source control
- Worked with Bitbucket
Job Types: Full-time, Permanent, In Office
Benefits
- Bonuses based on performance
- Additional training will be available and may be required
- Clean office, professional hardware, and a positive team.
Senior Software Engineer, Frontend
AliveCor (https://www.alivecor.com/), the pioneer of the smartphone EKG, with millions of EKGs from a large and growing user base, seeks a Senior Software Engineer (Front-end Applications) to contribute to the Applications team. You will be an integral member of our engineering team, responsible for developing industry-leading web applications that transform the healthcare industry and affect the way consumers use, engage with, and act on their health data.
- Passionate about User Experience.
- Has great verbal and written communication skills to engage directly with your peers.
- Understands and is thoughtful about engineering trade-offs.
- Experience building web applications with cross-browser support and RESTful API integrations.
- Commitment to delivering results.
- Passionate about continuous improvement.
- B.E in Computer Science or a related discipline, or related practical experience.
- 5-8 years of Frontend development experience with at least 2 years working as a React.js Developer.
- A strong grasp of software engineering and web application development fundamentals, including JavaScript (ES5, ES6), HTML and CSS.
- Aptitude to learn new technologies (e.g. ES.Next).
- Excellent troubleshooting skills.
- Familiarity with RESTful APIs to connect front-end applications to backend services.
- Bonus points for:
- Publicly available web applications, portfolios, or examples available on a source repository (e.g. GitHub, GitLab, Bitbucket)
- An interest in data visualizations using JavaScript libraries such as D3, Highcharts, and others
- Experience managing front end tooling (e.g. gulp, grunt, webpack, post-CSS, service workers)
- Experience with performance analysis and tuning of web applications
- Experience developing back end server APIs and/or server-side rendering
AliveCor is on a mission to define modern healthcare through data, design and disruption. We’ve pioneered the creation of FDA-cleared machine-learning techniques and transformed wearable medtech to put proactive heart care at everyone’s fingertips. Kardia is the most clinically validated mobile EKG technology. AliveCor was named one of the Top 10 Most Innovative Companies in Health for 2017 by Fast Company as part of the publication’s annual ranking of the world’s Most Innovative Companies. AliveCor was awarded the 2015 Tech Pioneer by the World Economic Forum and named one of the 50 Smartest Companies in 2015 by the MIT Technology Review. AliveCor recently announced a collaboration with Mayo Clinic that will result in new machine learning capabilities to unlock previously hidden health indicators in EKG data, potentially improving heart health as well as overall health care for a variety of conditions. AliveCor is a privately held company headquartered in Mountain View, CA.
AliveCor is an equal opportunity employer and will not discriminate against any employee or applicant on the basis of age, colour, disability, gender, national origin, race, religion, sexual orientation, or any other classification protected by federal, state, or local law.
Watch the following video demonstrating our product.
KardiaMobile: How's your heart? (https://www.youtube.com/watch?v=8I9xosgA-Ig)
Responsibilities:
* Implementation of a robust set of services / APIs to power the web application
* Building reusable code and libraries for future use
* Optimization of the application for maximum speed and scalability
* Implementation of security and data protection
* Integration of the front-end and back-end aspects of the web application
Requirements:
* Proficiency in developing modern web applications using Node.js & React.
* Good understanding of Database schema, design, optimisation, scalability.
* Good understanding of server-side CSS preprocessors.
* Ability to implement automated testing platforms and unit tests.
* Great communication skills, strong work ethic.
* Ownership of the product from start to finish.
* Knowledge of code versioning tools such as Git, Mercurial or SVN.
* Ability to learn new technologies quickly.
Nice to haves:
* Experience with AWS
* Expert level understanding of the HTML DOM and underlying event model
* Exposure to Gulp, Grunt, Bootstrap
* Prior Open source contributions
* Experience building responsive designs