
Job Description
Site Engineer – Civil Maintenance, Repair, and Operations (MRO)
Job Summary:
The Site Engineer for MRO is responsible for assisting in the maintenance, repair,
and operational activities of civil infrastructure and facilities. This role supports senior
engineers and maintenance teams to ensure that all civil works are executed efficiently,
safely, and in compliance with relevant standards.
Key Responsibilities:
∙Assist in the inspection and assessment of buildings, roads, drainage systems, and
other civil structures.
∙Prepare and maintain maintenance schedules for routine and preventive tasks.
∙Coordinate with contractors and vendors for repair and maintenance work.
∙Monitor and ensure the quality and safety of ongoing repair and maintenance
activities.
∙Maintain records of repairs, inspections, and material usage.
∙Assist in budgeting and cost estimation for civil MRO works.
∙Provide technical support during emergency repairs or structural failures.
∙Support in the procurement of materials and tools required for civil maintenance.
∙Ensure compliance with safety and environmental regulations.
Skills and Qualifications:
∙Bachelor’s or Diploma in Civil Engineering.
∙2–3 years of experience in civil maintenance or construction (internships count).
∙Familiarity with construction materials, repair techniques, and building codes.
∙Proficient in AutoCAD, MS Office, and project management tools.
∙Good communication and reporting skills.
∙Willingness to work on-site and respond to emergency repair calls.

About UCC Infra
Responsibilities:
• REVIT Family Creation and Modelling: Create, modify, and maintain REVIT families to be used in architectural projects, including furniture, fixtures, equipment, and building components. Draft the furniture layouts, finish plans, elevations, and details of hospitality projects in the U.S.
• Collaboration: Work closely with architects, designers, and other team members to understand project requirements and translate them into accurate REVIT families.
• Quality Control: Ensure that REVIT families meet industry standards, project-specific needs, and BIM best practices.
• Documentation: Maintain organized records of REVIT families and their associated parameters for easy access and retrieval.
• BIM Standards: Adhere to and help develop company-specific BIM standards and guidelines for family creation and management.
• BIM Integration: Assist in integrating REVIT families into project models, coordinating with other team members to ensure seamless BIM workflows.
• Model Review: Participate in model reviews to identify and resolve any issues related to REVIT families, ensuring model accuracy and consistency.
• Stay Updated: Keep up-to-date with industry trends and advancements in BIM technology, especially as it pertains to REVIT.
Qualifications:
• Bachelor's degree in Architecture, Engineering, or a related field.
• 0 to 3 years of experience in architectural practice with a strong focus on BIM and REVIT family creation, modelling, and drafting.
• Proficiency in Autodesk REVIT software and demonstrated expertise in creating and managing REVIT families.
• Strong understanding of BIM principles and workflows.
• Knowledge of BIM standards and best practices.
• Detail-oriented with excellent organizational skills.
• Good communication and collaboration skills.
• Ability to work independently and as part of a team.
• Willingness to learn and adapt to evolving BIM technologies and processes.
The Sr. AWS/Azure/GCP Databricks Data Engineer at Koantek will use comprehensive modern data engineering techniques and methods with Advanced Analytics to support business decisions for our clients. Your goal is to support the use of data-driven insights to help our clients achieve business outcomes and objectives. You can collect, aggregate, and analyze structured/unstructured data from multiple internal and external sources and communicate patterns, insights, and trends to decision-makers. You will help design and build data pipelines, data streams, reporting tools, information dashboards, data service APIs, data generators, and other end-user information portals and insight tools. You will be a critical part of the data supply chain, ensuring that stakeholders can access and manipulate data for routine and ad hoc analysis to drive business outcomes using Advanced Analytics. You are expected to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements:
Strong experience as an AWS/Azure/GCP Data Engineer; must have AWS/Azure/GCP Databricks experience
Expert proficiency in Spark, Scala, and Python
Must have data migration experience from on-prem to cloud
Hands-on experience with Kinesis, Event/IoT Hubs, and Cosmos DB to process and analyze streaming data
In-depth understanding of Azure/AWS/GCP cloud, data lake, and analytics solutions
Expert-level hands-on experience designing and developing applications on Databricks
Extensive hands-on experience implementing data migration and data processing using AWS/Azure/GCP services
In-depth understanding of Spark architecture, including Spark Streaming, Spark Core, Spark SQL, DataFrames, RDD caching, and Spark MLlib
Hands-on experience with the technology stack available in the industry for data management, data ingestion, capture, processing, and curation: Kafka, StreamSets, Attunity, GoldenGate, MapReduce, Hadoop, Hive, HBase, Cassandra, Spark, Flume, Impala, etc.
Hands-on knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake
Good working knowledge of code versioning tools such as Git, Bitbucket, or SVN
Hands-on experience using Spark SQL with various data sources such as JSON, Parquet, and key-value pairs
Experience preparing data for Data Science and Machine Learning, with exposure to model selection, model lifecycle, hyperparameter tuning, model serving, deep learning, etc.
Demonstrated experience preparing data and automating and building data pipelines for AI use cases (text, voice, image, IoT data, etc.)
Good to have: programming experience with .NET or Spark/Scala
Experience in creating tables, partitioning, bucketing, loading and aggregating data
using Spark Scala, Spark SQL/PySpark
Knowledge of AWS/Azure/GCP DevOps processes like CI/CD as well as Agile tools
and processes including Git, Jenkins, Jira, and Confluence
Working experience with Visual Studio, PowerShell scripting, and ARM templates
Able to build ingestion into ADLS and enable a BI layer for analytics
Strong understanding of data modeling and defining conceptual, logical, and physical data models
Big data/analytics/information analysis/database management in the cloud
IoT/event-driven/microservices in the cloud; experience with private and public cloud architectures, their pros/cons, and migration considerations
Ability to remain up to date with industry standards and technological advancements
that will enhance data quality and reliability to advance strategic initiatives
Working knowledge of RESTful APIs, OAuth2 authorization framework and security
best practices for API Gateways
Guide customers in transforming big data projects, including development and
deployment of big data and AI applications
Guide customers on Data engineering best practices, provide proof of concept, architect solutions and collaborate when needed
2+ years of hands-on experience designing and implementing multi-tenant solutions using AWS/Azure/GCP Databricks for data governance, data pipelines for near real-time data warehouses, and machine learning solutions
Overall 5+ years' experience in a software development, data engineering, or data analytics field using Python, PySpark, Scala, Spark, Java, or equivalent technologies
Hands-on expertise in Apache Spark (Scala or Python)
3+ years of experience in query tuning, performance tuning, troubleshooting, and debugging Spark and other big data solutions
Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
Ability to manage competing priorities in a fast-paced environment
Ability to resolve issues
Basic experience with or knowledge of agile methodologies
AWS Certified: Solutions Architect Professional
Databricks Certified Associate Developer for Apache Spark
Microsoft Certified: Azure Data Engineer Associate
GCP: Google Cloud Certified Professional
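The partitioning and bucketing skills listed above have a simple core idea. The sketch below is plain Python, not Spark, and all names and data are illustrative; it mimics how a bucketed table write assigns rows to a fixed number of buckets by hashing the bucket key, which is the concept behind Spark's `bucketBy`:

```python
# Conceptual sketch of hash bucketing, the idea behind Spark's bucketBy.
# Plain Python stand-in; names and data are illustrative, not a real dataset.
from collections import defaultdict
import zlib

def bucket_for(key: str, num_buckets: int) -> int:
    """Assign a row to a bucket by hashing its key (deterministic hash)."""
    return zlib.crc32(key.encode()) % num_buckets

def write_bucketed(rows, key_col, num_buckets):
    """Group rows into num_buckets 'files', like a bucketed table write."""
    buckets = defaultdict(list)
    for row in rows:
        buckets[bucket_for(row[key_col], num_buckets)].append(row)
    return dict(buckets)

rows = [{"user": u, "amount": a} for u, a in
        [("alice", 10), ("bob", 5), ("alice", 7), ("carol", 3)]]
buckets = write_bucketed(rows, "user", 4)

# All rows with the same key land in the same bucket, so a later
# join or aggregation on that key can avoid a full shuffle.
assert all(
    bucket_for(r["user"], 4) == b for b, rs in buckets.items() for r in rs
)
```

On Databricks the equivalent would be `df.write.bucketBy(4, "user")`, which persists this layout so Spark can skip shuffles on joins and aggregations over the bucketed key.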
- Strong experience in developing enterprise web applications.
- Expertise on the Microsoft technology stack (.NET, C#, ASP.NET, MVC, Web API, SQL Server)
- Candidates should have a solid understanding of object-oriented programming and programming principles: .NET, C#, MVC, XML, MS SQL database debugging, and code analysis.
- Candidates should have experience with the SDLC, including the following stages: requirements analysis, design, coding, testing, documentation, and implementation.
- Experience with .NET Core, Web API, MVC, C#, and Entity Framework
- Troubleshooting and analytical skills.
- Development experience to support the continuous improvement program.
- Good problem-solving capabilities in developed applications.
- Exposure to Angular is an added advantage
Not just a delivery company
RARA NOW is revolutionising instant delivery for e-commerce in Indonesia through data-driven logistics.
RARA NOW is making instant and same-day deliveries scalable and cost-effective by leveraging a differentiated operating model and real-time optimisation technology. RARA makes it possible for anyone, anywhere to get same-day delivery in Indonesia. While others are focusing on 'one-to-one' deliveries, the company has developed proprietary, real-time batching tech to do 'many-to-many' deliveries within a few hours. RARA is already in partnership with some of the top eCommerce players in Indonesia, like Blibli, Sayurbox, Kopi Kenangan, and many more.
We are a distributed team with the company headquartered in Singapore, core operations in Indonesia, and the technology team based out of India.
Future of eCommerce Logistics.
A data-driven logistics company bringing the same-day delivery revolution to Indonesia
Revolutionising delivery as an experience
Empowering D2C Sellers with logistics as the core technology
**About the Role**
Integration of user-facing elements developed by front-end developers with server-side logic
Implementation of security and data protection
Integration of data storage solutions
Strong proficiency with JavaScript
Knowledge of Node.js and frameworks available for it
Understanding the nature of asynchronous programming and its quirks and workarounds
Good understanding of server-side templating languages and CSS preprocessor
Basic understanding of front-end technologies, such as HTML5 and CSS3
User authentication and authorization between multiple systems, servers, and environments
Understanding differences between multiple delivery platforms, such as mobile vs. desktop, and optimizing output to match the specific platform
Implementing automated testing platforms and unit tests
Strong technical development experience in effectively writing code, performing code reviews, and implementing best practices on configuration management and code refactoring
Experience in working with vendor applications
Experience in making optimized queries to MySQL database
Proven problem solving and analytical skills
A delivery-focused approach to work and the ability to work without direction
Experience in Agile development techniques, including Scrum
Experience implementing and/or using Git
Ability to work collaboratively in teams and develop meaningful relationships to achieve common goals
Bachelor's degree in Computer Science or a related discipline preferred
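The asynchronous-programming requirement above is language-agnostic. A minimal sketch in Python's asyncio (the posting targets Node.js, but the same quirk applies to Promises there; function names and delays are illustrative) shows the classic pitfall of awaiting calls sequentially instead of concurrently:

```python
# Sketch of a common async quirk: sequential awaits serialize I/O waits,
# while gathering tasks overlaps them. Python asyncio used for illustration;
# the same pattern applies to Promise chains vs. Promise.all in Node.js.
import asyncio
import time

async def fetch(name: str, delay: float) -> str:
    """Stand-in for an I/O-bound call (e.g. an HTTP request)."""
    await asyncio.sleep(delay)
    return name

async def sequential():
    # Each await blocks until the previous call finishes: delays add up.
    return [await fetch("a", 0.1), await fetch("b", 0.1)]

async def concurrent():
    # gather() schedules both coroutines at once, so the waits overlap.
    return await asyncio.gather(fetch("a", 0.1), fetch("b", 0.1))

start = time.perf_counter()
asyncio.run(sequential())
seq_time = time.perf_counter() - start

start = time.perf_counter()
results = asyncio.run(concurrent())
con_time = time.perf_counter() - start

assert results == ["a", "b"]
assert con_time < seq_time  # concurrent version finishes sooner
```

The equivalent Node.js pitfall is `await a(); await b();` versus `await Promise.all([a(), b()])`.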
We are looking for experienced .NET/Angular developers to join our growing and diverse team.
Mandatory skills: .NET (C#/ASP), MVC, Web API, SQL/PostgreSQL, JavaScript, Angular.
Experience: 3 to 5 years
Responsibilities:
Build components that are scalable, flexible, and compatible across different frameworks and browsers. Some of the frameworks, tools, and techniques currently used in our technical solutions:
Frontend development using VueJS or similar framework such as ReactJS or Angular.
Backend development using .NET Core.
DevOps work using GCP and/or AWS.
Hosting using Docker/containers.
Message queue such as RabbitMQ / AWS SQS.
Storage and events using Postgres/SQL, EventStore, ElasticSearch, Redis/NoSQL.
Continuous development and deployment using tools like GitHub, TeamCity, Octopus Deploy.
Respond to trouble/support calls per SLA for applications in production, making quick repairs to keep applications running. Conduct complex analyses and troubleshoot issues.
Participate in code reviews to ensure that each increment adheres to user story and all standard resource libraries and architecture patterns as appropriate.
Be part of our onshore and offshore development team working on our API Management solution delivering business value with active participation in our SAFe development processes.
Create or Update documentation in support of development efforts.
Qualifications:
Undergraduate degree in Computer Science and/or equivalent experience.
Experience defining technical expectations and goals of projects
Development experience with Microsoft web technologies and database programming
Experience with C#, ASP.net, SQL, XML, COM, JavaScript, VBScript, HTML, Site Server
Good experience on Front end technologies: JavaScript, Typescript, Modern JavaScript Frameworks.
Working knowledge of Git is good to have.
Demonstrated interest in, knowledge of, and enthusiasm for Internet technologies.
Enhance existing applications with new features and available technologies to ensure higher security, stability, speed, and user-friendliness.
Knowledge of .Net 2.0, 3.5 and 4.0 coding using C#.
Knowledge of IIS 6.0, IIS 7.0 and IIS 8 and above.
Hands-on experience with MS SQL 2012/2016 database programming and management, writing complex SQL queries and procedures.
Experience designing applications interfacing .NET with MS SQL.
Java experience would be an added advantage.
Knowledge of API, Web Services and Ajax controls.
Should be able to integrate .Net applications with API.
Should have knowledge/experience of cross browser development.
Should have a sound grasp of object-oriented programming concepts.
ATM domain knowledge will be an added advantage.
Job Description - Data Engineer
About us
Propellor is aimed at bringing Marketing Analytics and other Business Workflows to the Cloud ecosystem. We work with International Clients to make their Analytics ambitions come true, by deploying the latest tech stack and data science and engineering methods, making their business data insightful and actionable.
What is the role?
This team is responsible for building a Data Platform for many different units. This platform will be built on the Cloud, and in this role the individual will organize and orchestrate different data sources and give recommendations on the services that fulfil goals based on the type of data.
Qualifications:
• Experience with Python, SQL, Spark
• Knowledge/notions of JavaScript
• Knowledge of data processing, data modeling, and algorithms
• Strong in data, software, and system design patterns and architecture
• API building and maintaining
• Strong soft skills, communication
Nice to have:
• Experience with cloud: Google Cloud Platform, AWS, Azure
• Knowledge of Google Analytics 360 and/or GA4.
Key Responsibilities
• Design and develop the platform based on a microservices architecture.
• Work on the core backend and ensure it meets the performance benchmarks.
• Work on the front end with ReactJS.
• Design and develop APIs for the front end to consume.
• Constantly improve the architecture of the application by clearing the technical backlog.
• Meet both technical and consumer needs.
• Stay abreast of developments in web applications and programming languages.
What are we looking for?
An enthusiastic individual with the following skills. Please do not hesitate to apply if you do not match all of them. We are open to promising candidates who are passionate about their work and are team players.
• Education - BE/MCA or equivalent.
• Agnostic/Polyglot with multiple tech stacks.
• Worked on open-source technologies – NodeJS, ReactJS, MySQL, NoSQL, MongoDB, DynamoDB.
• Good experience with Front-end technologies like ReactJS.
• Backend exposure – good knowledge of building API.
• Worked on serverless technologies.
• Efficient in building microservices that combine the server and front end.
• Knowledge of cloud architecture.
• Should have sound working experience with relational and columnar databases.
• Should be innovative and communicative in approach.
• Will be responsible for the functional/technical track of a project.
Whom will you work with?
You will closely work with the engineering team and support the Product Team.
Hiring process includes:
a. Written Test on Python and SQL
b. 2 - 3 rounds of Interviews
Immediate Joiners will be preferred
Should have knowledge of web frameworks such as Django, plus HTML, CSS, and the Python programming language.
Should have relevant work experience in coding and web development.
Good knowledge of Python and Django.
Good knowledge of HTML5, CSS3, and JavaScript.
Good knowledge of the MySQL database.
Knowledge of Django REST Framework and APIs.
Education qualification: Bachelor's degree in computer science, information management systems, or a related field.
Work Experience - 1 to 2 years (Required)
Job Type: Full-time
Salary: ₹15,000.00 - ₹25,000.00 per month
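The web-framework knowledge asked for above reduces, at its simplest, to a request-in, JSON-response-out cycle. The sketch below uses Python's stdlib wsgiref rather than Django (route, field names, and data are all illustrative assumptions); it shows the shape that a Django REST Framework view formalizes:

```python
# Minimal WSGI app returning JSON, sketching the request/response cycle
# that frameworks like Django abstract away. All names are illustrative.
import json
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """Route GET /books to a JSON list; anything else is a 404."""
    if environ["PATH_INFO"] == "/books" and environ["REQUEST_METHOD"] == "GET":
        body = json.dumps([{"id": 1, "title": "Example"}]).encode()
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]

# Exercise the app without a running server by faking a WSGI environ.
environ = {}
setup_testing_defaults(environ)  # fills in REQUEST_METHOD="GET", etc.
environ["PATH_INFO"] = "/books"
status_holder = {}

def start_response(status, headers):
    status_holder["status"] = status

body = b"".join(app(environ, start_response))
assert status_holder["status"] == "200 OK"
assert json.loads(body)[0]["title"] == "Example"
```

A Django REST Framework view would express the same handler declaratively (serializer plus ViewSet) and add routing, authentication, and content negotiation on top.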
Responsibilities:
- Design and deliver scalable web services, APIs and backend data modules.
- Understand requirements and develop reusable code using design patterns & component architecture and write unit test cases
- Collaborate with product management and engineering teams to elicit and understand the requirements and develop solutions
- Stay current with latest tools, technology ideas and methodologies; share knowledge by clearly articulating results and ideas to key decision makers.
Required Qualifications:
- 6+ years of experience writing multithreaded programs in Java
- Experience with Java, Spring Boot, Apache NiFi; working knowledge of Docker, EKS, Azkaban, Jenkins
- Experience with Git and build tools like Gradle/Maven/SBT.
- Strong understanding of object-oriented design, data structures, algorithms, profiling, and optimization.
- Have elegant, readable, maintainable and extensible code style.
- Experience on AWS is preferable
- Knowledge of core algorithms and data structures: sorting, heaps/stacks, queues, search, etc.
- Familiarity with test-driven development
- Thrive in a fast-paced environment, with the ability to deliver quality code quickly.
- Attention to detail. Strong communication and collaboration skills.
- BS in Computer Science or equivalent
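The multithreading requirement above hinges on safe shared state. A minimal sketch (in Python for consistency with the other examples here; the posting targets Java, where the analogue is `synchronized` or `AtomicInteger`) shows why a lock is needed around a read-modify-write:

```python
# Sketch of protecting a shared counter across threads with a lock.
# Python used for illustration; in Java this is synchronized/AtomicInteger.
import threading

class Counter:
    """Thread-safe counter: the lock makes read-modify-write atomic."""
    def __init__(self):
        self._value = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:  # without this, concurrent increments can be lost
            self._value += 1

    @property
    def value(self):
        return self._value

counter = Counter()
threads = [
    threading.Thread(target=lambda: [counter.increment() for _ in range(1000)])
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter.value == 8000  # 8 threads x 1000 increments, none lost
```

The same pattern (guard the critical section, join before reading the result) carries over directly to `java.util.concurrent` locks and executors.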

- Good experience with RESTful services
- Hands-on experience with Node.js
- Should have previous working experience with SQL and PostgreSQL
- Must have 3-5 years of relevant experience in Node.js.