
JOB DESCRIPTION – Scan-to-BIM Engineer (LiDAR)
Location: Dwarka, New Delhi
Experience: 1–12 Years
Role Overview:
The Scan-to-BIM Engineer will support the LiDAR (ALS/MLS/TLS) Scan-to-BIM Expert in processing point cloud datasets, developing BIM models, preparing engineering drawings, and assisting in the delivery of infrastructure as-built documentation.
The role is designed for candidates looking to grow into a full Scan-to-BIM expert role through hands-on project exposure, structured technical guidance, and multi-domain geospatial-BIM integration experience.
Key Responsibilities:
• Assist in processing LiDAR datasets including registration, filtering, alignment, classification, and QA checks.
• Support creation of LOD 200–400 BIM models for bridges, railways, roads, utilities, and built infrastructure.
• Extract sections, plans, profiles, and structural elements from point clouds under expert supervision.
• Assist in preparing bridge inspection drawings, corridor as-built documentation, and clearance studies.
• Aid in integrating mesh files with BIM geometry and CAD drawings.
• Maintain organized datasets, model versions, documentation, and quality reports.
• Coordinate with GIS, CAD, field-survey, and QA teams for seamless workflow support.
• Learn and operate software such as Revit, Civil 3D, Navisworks, CloudCompare, ReCap, TerraScan, etc.
Key Performance Indicators (KPIs):
• Accuracy and completeness of assigned modelling and extraction tasks.
• Timely completion of work packages as delegated by senior experts.
• Reduction in errors through consistent QA adherence.
• Improvement in modelling speed and competency over time.
• Documentation quality and maintenance of clean data workflows.
• Contribution to team communication and project progress.
Preferred Qualifications & Skills:
• Diploma/B.Tech in Civil Engineering, Geomatics, Architecture, GIS, or related fields.
• Basic understanding of LiDAR, point clouds, or BIM workflows preferred.
• Working knowledge of Revit, AutoCAD, or point cloud tools is an advantage.
• Strong technical aptitude, willingness to learn, and attention to detail.
Key Responsibilities:
· Lead the design and implementation of scalable infrastructure using IaC principles.
· Develop and manage configuration management tools primarily with Chef.
· Write and maintain automation scripts in Python to streamline infrastructure tasks.
· Build, manage, and version infrastructure using Terraform.
· Collaborate with cloud architects and DevOps teams to ensure highly available, secure, and scalable systems.
· Provide guidance and mentorship to junior engineers.
· Monitor infrastructure performance and provide optimization recommendations.
· Ensure compliance with best practices for security, governance, and automation.
· Maintain and improve CI/CD pipelines with infrastructure integration.
· Support incident management, troubleshooting, and root cause analysis for infrastructure issues.
Required Skills & Experience:
· Strong hands-on experience in:
o Chef (Cookbooks, Recipes, Automation)
o Python (Scripting, automation tasks, REST APIs)
o Terraform (Modules, state management, deployments)
· Experience in AWS services (EC2, VPC, IAM, S3, etc.)
· Familiarity with Windows administration and automation.
· Solid understanding of CI/CD processes, the infrastructure lifecycle, and Git-based workflows.
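To illustrate the kind of Python automation scripting this role calls for, here is a minimal, hypothetical sketch of a retry helper for flaky infrastructure calls (the `provision` function and its failure behaviour are invented for illustration, not part of any real cloud SDK):

```python
import time
from functools import wraps

def retry(attempts=3, delay=0.1):
    """Retry a transiently failing infrastructure call with exponential backoff."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            last_exc = None
            for i in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    last_exc = exc
                    time.sleep(delay * (2 ** i))  # back off before retrying
            raise last_exc
        return wrapper
    return decorator

calls = []

@retry(attempts=3, delay=0)
def provision():
    # Stand-in for a cloud API call that fails transiently on the first attempts.
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient error")
    return "ok"

print(provision())  # succeeds on the third attempt
```

In practice the same pattern wraps calls to tools like Terraform or the AWS APIs, where transient network errors are common.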
Build AI Systems That Change How Industries Operate
Tailored AI is not just another tech company. We’re building the McKinsey of AI systems: a new kind of firm, made up of engineers who understand business deeply and use AI as a force multiplier.
As an SDE 2, you’ll lead and own the engineering for an entire product track, often working directly with clients and stakeholders. You’ll be the architect, the executor, and the problem-solver-in-chief. You’ll take vague problem statements, turn them into elegant solutions, and bring them to life in production.
What You’ll Do
- Architect and build AI-powered software solutions from scratch
- Own a full engineering track—backend, infra, integrations, and LLM workflows
- Interface with customers to align on specs, iterate fast, and deploy with confidence
- Mentor SDE 1s and Interns, conduct code reviews, and guide engineering quality
- Stay on top of AI trends, contribute to internal tooling and shared best practices
What You’ll Gain
- Leadership opportunities and fast progression to Senior SDE roles
- Deep knowledge of how AI is transforming industries while actually building it
- High ownership, zero bureaucracy, and direct influence on product direction
- Exposure to multi-agent AI systems, enterprise integrations, and scalable infra
Who You Are
- 2–3 years of strong backend engineering experience
- Proven track record of owning software modules and delivering in production
- Skilled in Python, Django/FastAPI, Postgres, AWS
- Exposure to system design and performance optimization
- Interest in AI tools like Langchain, OpenAI, vector DBs, etc.
- Strong analytical and communication skills
Tech Stack You’ll Work With
- Python, Django, FastAPI
- Postgres, Redis, S3
- EC2, Lambda, Cloudwatch
- Langchain, LLM APIs, Vector DBs
- REST APIs, Microservices, GitHub Actions
Some Real Problems You Might Work On
- Building a multi-agent career coaching assistant that guides users and automates job hunting
- Deploying a chatbot that generates employee performance reviews on-demand from HR data
- Designing an LLM pipeline to help Indian lawyers access precedents, statutes, and case law in seconds
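As a sketch of the retrieval step in an LLM pipeline like the legal-research example above, here is a toy ranking function; the bag-of-words "embedding", the corpus, and all document names are invented for illustration (a real pipeline would use an embedding model and a vector DB):

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding' standing in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical mini-corpus of legal documents.
corpus = {
    "precedent-1": "landmark judgment on contract breach damages",
    "statute-7": "statute governing limitation periods for civil suits",
    "case-42": "case law on damages for breach of a service contract",
}

def retrieve(query, k=2):
    """Rank documents by similarity; the top-k would be passed to an LLM as context."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(corpus[d])), reverse=True)
    return ranked[:k]

print(retrieve("damages for contract breach"))  # → ['case-42', 'precedent-1']
```

The design point is the separation of retrieval from generation: ranking happens against the corpus first, and only the top results are handed to the model.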
Interview Process
- Screening – A quick call with a Co-Founder to align on fit
- CV + Puzzle + Programming – 1 hour round to gauge problem-solving and fundamentals
- Live Coding – Solve a coding task using Python + docs
- System Design – For SDE 2, a take-home problem and a detailed discussion round
Past experience working with a product start-up is a plus.
● Proficiency in fundamental front end languages such as HTML, CSS, and JavaScript.
● Familiarity with JavaScript frameworks such as AngularJS, React, and others.
● Knowledge of Node.js and frameworks available for it such as Express (recommended).
● Good understanding of asynchronous request handling, partial page updates, and AJAX
● Understanding of REST Services.
● Experience with Redis.
● Familiarity with database technology such as MySQL, MongoDB and Elasticsearch.
● Proficient understanding of serverless programming.
● Proficient understanding of code versioning tools, such as Git.
● Good problem-solving skills.
● Attention to detail.
● Demonstrated experience as a software engineer, with at least 5-8 years’ experience in technology roles
● Experience working on complex systems and cloud architectures, preferably in a B2B or enterprise context
● Significant experience with the Java programming language and frameworks such as Spring & Spring Boot
● Good working experience with front-end JavaScript frameworks such as ReactJS
● Experience optimizing databases and SQL queries for high performance
● Good knowledge of AWS services, design patterns and practices - ideally certified
● Experience and keen understanding of the value of working in agile teams
● A “quality-first” mindset, with experience working in continuous integration environments
● Highly effective at communicating, and comfortable whiteboarding design ideas with teams of engineers, product managers, and business analysts
● Desire to challenge the status quo and maturity to know when to compromise
● Respect for other team members and a highly collaborative approach to working and learning together
- Owns the end-to-end implementation of the assigned data processing components/product features, i.e. design, development, deployment, and testing of the data processing components and associated flows, conforming to best coding practices
- Creation and optimization of data engineering pipelines for analytics projects
- Support data and cloud transformation initiatives
- Contribute to our cloud strategy based on prior experience
- Independently work with all stakeholders across the organization to deliver enhanced functionalities
- Create and maintain automated ETL processes with a special focus on data flow, error recovery, and exception handling and reporting
- Gather and understand data requirements, work in the team to achieve high-quality data ingestion, and build systems that can process and transform the data
- Be able to comprehend the application of database indexes and transactions
- Involve in the design and development of a Big Data predictive analytics SaaS-based customer data platform using object-oriented analysis, design and programming skills, and design patterns
- Implement ETL workflows for data matching, data cleansing, data integration, and management
- Maintain existing data pipelines, and develop new data pipelines using big data technologies
- Responsible for leading the effort of continuously improving the reliability, scalability, and stability of microservices and the platform
- Working with the product team to develop new features focused on improving the user experience
- Improving existing features and working on streamlining client implementations through improved tools
- Improving our technical architecture and building out a continuous integration pipeline
- Modernizing our front-end in new frameworks
- Everything else - our team is small and you'll likely be involved in almost every tech-related thing going on
Requirements
- 1+ years of experience building consumer-facing web apps
- You are interested in the full-stack opportunity and love building a feature from start to finish
- Self-starter with a deep interest in tech - we want someone who will come in with opinions, and shape our engineering practices and decisions for the better
- An eye for design - you'll have an important role in making the product look great
- Believe in our mission and love the idea of working in education to help students succeed
Technologies you'll work with:
- NodeJS, Sails.js
- React.js, Redux, Redux-Saga
- Kubernetes, Postgres, ElasticSearch, Redis, RabbitMQ
- Whatever else you decide - we're constantly re-evaluating our stack and tools
- Having prior experience with the technologies is a plus, but not mandatory for skilled candidates.
Benefits
- Remote Option - You can work from any location of your choice
- Reimbursement of Home Office Setup
- Competitive Salary
- Friendly atmosphere
- Flexible paid vacation policy
Location: Bangalore
We are looking for the right Backend Developer.
What you will work on
Build a scalable API platform that will enhance our customer experience & propel our logistics. You will be part of our Bangalore team of ambitious and talented engineers, who put their best together to build architecturally sound & scalable systems.
What can CasaOne promise you –
An opportunity to:
- increase your rate of learning exponentially by defining hard problems and solving them
- partake in a high-growth journey and increase revenues 5x+ Y-o-Y
- be an early innovator in the shifting trend: ‘ownership economy’ -> ‘access economy’
- build a category-defining platform for FF&E (Furniture, Fixture, and Equipment) leasing
- build high-performance teams
The must-haves
• Bachelor’s or Master’s degree in engineering
• Good understanding of algorithms, data structures & design patterns
• A minimum of 4 years of work experience
Experience required in
• Building distributed systems & service-oriented architecture
• Asynchronous programming, Test Driven Development (TDD)
• Writing (delightful) APIs & integration patterns
• RDBMS & NoSql databases
• Continuous integration & deployment (CI/CD) tools like git, Jenkins
• Cloud computing platforms - AWS/ Azure/ Google Cloud
Good to know
CasaOne backend services are written in NodeJS. Experience in NodeJS will be handy, but it isn’t mandatory.
