

Required Skills
- 4-6 years of professional experience building server applications with Node.js and the Spring framework.
- Experience working with the AWS development stack, including Lambda, API Gateway, DynamoDB, Cognito, Mobile Analytics, EC2, and RDS.
Experience working with a handful of the following:
- Relational databases: MySQL, PostgreSQL, or Oracle
- Document-based data stores: MongoDB or CouchDB
- Key-value stores: DynamoDB, Redis, or Memcached
- Column stores: Cassandra or Vertica
- Ability to work with front-end web technologies such as HTML5, CSS3, Angular, and Bootstrap
- Strong familiarity with *nix command line operations
Responsibilities
● Creating RESTful APIs with Node.js.
● Collaborating with front-end developers to integrate elements.
● Implementing effective security protocols, data-protection measures, and storage solutions.
● Maintaining all required documentation for your project.
● Continually proposing and implementing new ideas to improve the app’s performance.
● Defining and communicating technical and design requirements.
● Learning about new technologies and staying up to date with current best practices.
● Creating unit and integration tests to ensure code quality.
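The first responsibility above is building RESTful APIs. Although this role uses Node.js, the resource-and-verb mapping at the heart of REST is language-agnostic; as a rough illustration (not part of the posting), here is a minimal dispatcher in Python over a hypothetical in-memory `users` resource:

```python
# Minimal illustration of RESTful routing: HTTP verbs map to CRUD
# operations on a resource collection (here, an in-memory dict).
users = {}
next_id = 1

def handle(method, path, body=None):
    """Dispatch (method, path) to the matching CRUD operation."""
    global next_id
    if method == "POST" and path == "/users":
        user_id = next_id
        next_id += 1
        users[user_id] = body
        return 201, {"id": user_id, **body}      # 201 Created
    if method == "GET" and path.startswith("/users/"):
        user_id = int(path.rsplit("/", 1)[1])
        if user_id in users:
            return 200, {"id": user_id, **users[user_id]}
        return 404, {"error": "not found"}
    if method == "DELETE" and path.startswith("/users/"):
        user_id = int(path.rsplit("/", 1)[1])
        if users.pop(user_id, None) is not None:
            return 204, None                     # 204 No Content
        return 404, {"error": "not found"}
    return 405, {"error": "method not allowed"}
```

In a real Node.js service the same mapping would be expressed as Express route handlers; the status-code discipline (201 on create, 404 on missing, 204 on delete) carries over unchanged.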
Requirements
● Knowledge of databases and familiarity with schema design in NoSQL stores (e.g., MongoDB).
● Knowledge of relational databases like MySQL is preferred.
● A good understanding of the software development lifecycle.
● Knowledge of API design and development using REST.
● Knowledge of version control systems like Git.
● A good understanding of object-oriented programming (OOP) and OOP patterns.
● You don’t have to know all of this in depth, but you should know how to research and find solutions.
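Schema design in a document store like MongoDB largely comes down to deciding what to embed versus what to reference. A toy sketch of that idea, using plain Python dicts to stand in for BSON documents (all field names are invented for illustration):

```python
# A MongoDB-style document embeds related data (line items) directly,
# trading some duplication for single-read access: totals can be
# computed from one document with no join.
order_doc = {
    "_id": "ord-1001",
    "customer": {"name": "Asha", "city": "Pune"},   # embedded sub-document
    "items": [                                       # embedded array
        {"sku": "A1", "qty": 2, "price": 150.0},
        {"sku": "B7", "qty": 1, "price": 90.0},
    ],
}

REQUIRED = {"_id", "customer", "items"}

def validate(doc):
    """Check that the document carries every required top-level field."""
    missing = REQUIRED - doc.keys()
    return (len(missing) == 0, sorted(missing))

def order_total(doc):
    """Total is computable from the embedded items without a join."""
    return sum(it["qty"] * it["price"] for it in doc["items"])
```

In MongoDB itself, the same required-field check would typically be enforced with a collection-level JSON Schema validator rather than application code.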
About the Company –
OTO is re-imagining how 20M two-wheelers are bought and financed every year in India. We aim to simplify the complete experience of buying and owning a two-wheeler by building products that are customer-centric and scalable. We are backed by some of the best fintech experts and institutional funds in the country.
Designation - Assistant Manager - Sales Trainer
Skills and Qualifications
Bachelor’s Degree.
3 to 5 years of experience.
Experience developing training materials.
Attention to detail.
Excellent written and oral communication skills.
Fluency in Hindi is a must.
Job Responsibilities and Duties:
Deliver training for all new hires and continual training for the entire sales staff.
Discipline and provide coaching to salespeople as necessary.
Engage in one-on-one reviews with salespeople to maximize performance.
Strategize new training methods to be implemented.
Partner with sales and marketing departments to set and progress toward goals.
Collect and interpret data to determine the effectiveness of training techniques.
Write and publish training materials.

1) Should be able to author low-level design documents or implementation-approach documents.
2) Should understand industry best practices and processes associated with software development.
3) Should participate in design discussions, backlog grooming, estimation, and other Scrum ceremonies.
4) Should lead the development team on technical solutions, POCs, quality assurance, and timely delivery.


What’s the job? .NET Developer
Where do I work? GyanMatrix
Experience 8-11 Yrs
Why are we here?
We are looking for a .NET Developer who is motivated to combine the art of design with the art of programming. Responsibilities will include implementing visual elements and their behavior in response to user interactions. You will work with full-stack developers to build all client-side logic. You will also bridge the gap between the visual elements and the server-side infrastructure, taking an active role on both sides and helping define how the application looks and functions.
Mandatory skills to have:
5+ years of experience in Classic ASP and JSP.
2+ years of experience in Tomcat administration.
3+ years of experience in SQL Server.
Good coding and design skills. Good application troubleshooting skills.
SQL Server 2019: SQL queries, procedures, and functions.
Ability to interpret business requirements and translate them into working software.
Web and UX proficiency.
Primary Responsibilities:
- Strong SQL database development skills using MS SQL Server.
- Strong skills in SQL Server Integration Services (SSIS) for ETL development.
- Strong experience in full-lifecycle database development projects using SQL Server.
- Experience designing and implementing complex logical and physical data models.
- Exposure to web services and web technologies (JavaScript, jQuery, CSS, HTML).
- Knowledge of other high-level languages (Perl, Python) is an added advantage.
- SQL certification is nice to have.
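The SSIS work described above follows the standard extract-transform-load shape: pull raw rows, clean and filter them, and load them into a target table for querying. A toy sketch of that flow, using Python's built-in sqlite3 in place of SQL Server (table and column names are made up for illustration):

```python
import sqlite3

def run_etl(rows):
    """Extract raw (region, amount) rows, transform (clean + filter),
    and load them into a table, then aggregate with SQL."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform step: normalise region names, drop non-positive amounts.
    cleaned = [(r.strip().upper(), amt) for r, amt in rows if amt > 0]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    # The kind of aggregate the posting's procedures/functions would run.
    total = conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    ).fetchall()
    conn.close()
    return total
```

In SSIS the same three stages map to source components, data-flow transformations, and destination components inside a package.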
Good to have:
• Bachelor’s degree or a minimum of 3 years of formal industry/professional experience in software development; healthcare background preferred.

Introduction
Synapsica (http://www.synapsica.com/) is a series-A funded HealthTech startup (https://yourstory.com/2021/06/funding-alert-synapsica-healthcare-ivycap-ventures-endiya-partners/) founded by alumni of IIT Kharagpur, AIIMS New Delhi, and IIM Ahmedabad. We believe healthcare needs to be transparent and objective while remaining affordable. Every patient has the right to know exactly what is happening in their body, and they shouldn’t have to rely on the cryptic two-liners given to them as a diagnosis.
Towards this aim, we are building an artificial-intelligence-enabled, cloud-based platform to analyse medical images and create v2.0 of advanced radiology reporting. We are backed by IvyCap, Endiya Partners, Y Combinator, and other investors from India, the US, and Japan. We are proud to have GE and The Spinal Kinetics as our partners. Here’s a small sample of what we’re building: https://www.youtube.com/watch?v=FR6a94Tqqls
Your Roles and Responsibilities
We are looking for an experienced MLOps Engineer to join our engineering team and help us create dynamic software applications for our clients. In this role, you will be a key member of the team, involved in decision-making, implementation, development, and advancement of ML operations for the core AI platform.
Roles and Responsibilities:
- Work closely with a cross-functional team to serve business goals and objectives.
- Develop, implement, and manage MLOps in cloud infrastructure for data preparation, deployment, monitoring, and model retraining.
- Design and build application containerisation, orchestrated with Docker and Kubernetes on the AWS platform.
- Build and maintain code, tools, and packages in the cloud.
Requirements:
- At least 2 years of experience in data engineering.
- At least 3 years of experience in Python, with familiarity with popular ML libraries.
- At least 2 years of experience in model serving and pipelines.
- Working knowledge of container technologies such as Docker and Kubernetes on AWS.
- Experience designing distributed-systems deployments at scale.
- Hands-on experience in coding and scripting.
- Ability to write effective, scalable, and modular code.
- Familiarity with Git workflows, CI/CD, and NoSQL (MongoDB).
- Familiarity with Airflow, DVC, and MLflow is a plus.
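One concrete piece of the monitoring-and-retraining loop mentioned above is drift detection: compare live feature statistics against the training baseline and flag the model for retraining when they diverge. A deliberately simplified sketch (the threshold and function names are illustrative, not from any specific MLOps library):

```python
import statistics

def needs_retraining(baseline, live, threshold=0.1):
    """Flag retraining when the live feature mean drifts more than
    `threshold` (as a fraction of the baseline mean) from the data
    the model was trained on."""
    base_mean = statistics.mean(baseline)
    live_mean = statistics.mean(live)
    drift = abs(live_mean - base_mean) / abs(base_mean)
    return drift > threshold
```

In production this check would typically run as a scheduled job (e.g., an Airflow task) over recent inference logs, with richer statistics than a mean shift.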
What you will do:
- Build exciting, high-impact features for the small-business community, such as inventory management, billing, accounting, and tax filing.
- Bring high user empathy: look at the entire flow, take care of all the edge cases, and be creative in problem solving.
- Take a customer-first approach: always think customer first and be unwilling to compromise on customer experience.
- Past experience creating products for small and medium businesses is preferred.
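Features like the inventory management and billing mentioned above reduce to careful bookkeeping over a stock ledger. A minimal sketch of stock tracking with a tax line on each bill (the class, its fields, and the GST-like tax rate are all invented for illustration):

```python
class Inventory:
    """Track stock levels and produce simple bills with a tax line."""

    def __init__(self, tax_rate=0.18):           # illustrative GST-like rate
        self.stock = {}                          # sku -> (qty, unit_price)
        self.tax_rate = tax_rate

    def add_stock(self, sku, qty, unit_price):
        have, _ = self.stock.get(sku, (0, unit_price))
        self.stock[sku] = (have + qty, unit_price)

    def bill(self, sku, qty):
        """Decrement stock and return the bill with subtotal, tax, total."""
        have, price = self.stock[sku]
        if qty > have:
            raise ValueError("insufficient stock")
        self.stock[sku] = (have - qty, price)
        subtotal = qty * price
        tax = round(subtotal * self.tax_rate, 2)
        return {"subtotal": subtotal, "tax": tax, "total": subtotal + tax}
```

The edge cases the posting alludes to live exactly here: overselling, returns, and rounding of tax amounts all need explicit handling.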
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Advanced working SQL knowledge, including query authoring, experience with relational databases, and working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field, and who has experience with the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift.
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
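The stream-processing systems listed above (Storm, Spark Streaming) share a core idea: incremental aggregation over an unbounded event stream rather than a batch query over data at rest. A stripped-down sketch of one such aggregation, a tumbling-window count, in plain Python (the event shape and window size are illustrative):

```python
from collections import Counter

def tumbling_window_counts(events, window=60):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count keys per window, the way a streaming job aggregates
    per micro-batch."""
    windows = {}
    for ts, key in events:
        bucket = ts // window          # index of the window this event falls in
        windows.setdefault(bucket, Counter())[key] += 1
    # Report each window by its start timestamp.
    return {b * window: dict(c) for b, c in sorted(windows.items())}
```

A real streaming engine adds what this sketch omits: out-of-order events, watermarks, and fault-tolerant state, which is precisely why frameworks like Spark Streaming exist.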

