Job Title: Project cum Sales (Techno-Commercial) – ALC Wall Panels
Experience: Minimum 5 years in ALC wall panels or related industry
Location: Bangalore / Hyderabad / Chennai
Salary: Up to ₹10 LPA (based on experience and fit)
Industry: Building Materials / Construction / Precast Panels
Job Overview:
We are looking for a dynamic and experienced Techno-Commercial Professional to handle project coordination and sales for ALC Wall Panels. The ideal candidate should possess strong technical knowledge of wall panel systems, a proven sales track record, and hands-on project execution experience in the construction or precast industry.
Key Responsibilities:
Sales & Business Development
- Identify and generate business opportunities in residential, commercial, and industrial construction sectors.
- Develop and maintain relationships with architects, builders, contractors, and consultants.
- Conduct product presentations and site demos.
- Prepare and deliver techno-commercial proposals and quotations.
- Drive sales targets and market penetration across assigned regions.
Project Coordination & Execution
- Coordinate end-to-end execution of ALC panel projects: planning, site supervision, installation, and quality control.
- Act as a liaison between the client, design team, and site team to ensure timely execution.
- Manage logistics, delivery schedules, and manpower coordination.
- Provide on-site technical support and resolve issues during installation.
Techno-Commercial Responsibilities
- Evaluate client requirements and offer optimal technical solutions.
- Prepare BOQs, project estimates, and cost sheets.
- Negotiate contracts, payment terms, and commercial agreements.
- Monitor receivables and ensure timely collections.
Key Requirements:
- Bachelor’s degree/Diploma in Civil Engineering or related field.
- Minimum 5 years of experience in ALC wall panels or precast wall panel industry.
- Strong knowledge of construction practices, site execution, and building material sales.
- Excellent communication, presentation, and negotiation skills.
- Willingness to travel within South India as per project and client needs.
- Proficient in MS Office, AutoCAD (basic), and project tracking tools.
Mail your updated resume with current salary to:
email: etalenthire[at]gmail[dot]com
Satish: 88 O2 74 97 43

Responsibilities
- Design and architect data virtualization solutions using Denodo.
- Collaborate with business analysts and data engineers to understand data requirements and translate them into technical specifications.
- Implement best practices for data governance and security within Denodo environments.
- Lead the integration of Denodo with various data sources, ensuring performance optimization.
- Conduct training sessions and provide guidance to technical teams on Denodo capabilities.
- Participate in the evaluation and selection of data technologies and tools.
- Stay current with industry trends in data integration and virtualization.
Requirements
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in data architecture, with a focus on Denodo solutions.
- Strong knowledge of data virtualization principles and practices.
- Experience with SQL and data modeling techniques.
- Familiarity with ETL processes and data integration tools.
- Excellent communication and presentation skills.
- Ability to lead technical discussions and provide strategic insights.
- Certifications related to Denodo or data architecture are a plus.
About the Role
Hudson Data is looking for a Senior / Mid-Level SQL Engineer to design, build, optimize, and manage our data platforms. This role requires strong hands-on expertise in SQL, Google Cloud Platform (GCP), and Linux to support high-performance, scalable data solutions.
We are also hiring Python Programmers / Software Developers / Front-End and Back-End Engineers.
Key Responsibilities:
- Develop and optimize complex SQL queries, views, and stored procedures
- Build and maintain data pipelines and ETL workflows on GCP (e.g., BigQuery, Cloud SQL)
- Manage database performance, monitoring, and troubleshooting
- Work extensively in Linux environments for deployments and automation
- Partner with data, product, and engineering teams on data initiatives
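As a hedged illustration of the first responsibility above, here is a minimal sketch of building a view, adding an index, and checking the query plan. It uses Python's built-in sqlite3 as a stand-in; on GCP this work would target BigQuery or Cloud SQL, and the table and column names are purely illustrative.

```python
# Sketch: create a reporting view, add an index, and verify the
# optimizer uses it. sqlite3 stands in for a production database.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)")
cur.executemany("INSERT INTO orders (customer, amount) VALUES (?, ?)",
                [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)])

# A view aggregating revenue per customer
cur.execute("""CREATE VIEW revenue_by_customer AS
               SELECT customer, SUM(amount) AS total
               FROM orders GROUP BY customer""")

# An index to speed up point lookups by customer
cur.execute("CREATE INDEX idx_orders_customer ON orders (customer)")

rows = cur.execute(
    "SELECT customer, total FROM revenue_by_customer ORDER BY customer").fetchall()
print(rows)  # [('alice', 150.0), ('bob', 75.5)]

# EXPLAIN QUERY PLAN shows whether the index is actually used
plan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer = 'bob'").fetchall()
print(any("idx_orders_customer" in str(r) for r in plan))  # True
```

The same pattern — view for the stable reporting interface, index for the hot predicate, query plan to confirm — carries over to BigQuery and Cloud SQL tooling.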
Required Skills & Qualifications
Must-Have Skills (Essential)
- Expert-level GCP skills (mandatory)
- Strong Linux / shell scripting (mandatory)
Nice to Have
- Experience with data warehousing and ETL frameworks
- Python / scripting for automation
- Performance tuning and query optimization experience
Soft Skills
- Strong analytical, problem-solving, and critical-thinking abilities.
- Excellent communication and presentation skills, including data storytelling.
- Curiosity and creativity in exploring and interpreting data.
- Collaborative mindset, capable of working in cross-functional and fast-paced environments.
Education & Certifications
- Bachelor's degree in Data Science, Computer Science, Statistics, Mathematics, or a related field.
- Master's degree in Data Analytics, Machine Learning, or Business Intelligence preferred.
Why Join Hudson Data
At Hudson Data, you'll be part of a dynamic, innovative, and globally connected team that uses cutting-edge tools, from AI and ML frameworks to cloud-based analytics platforms, to solve meaningful problems. You'll have the opportunity to grow, experiment, and make a tangible impact in a culture that values creativity, precision, and collaboration.
Should be CA qualified or hold an MBA (Finance) or a similar relevant qualification; Big 4 experience preferred.
Work Experience: 3-5 years of relevant experience required.
Salary Range: ₹6-8 LPA
Financial Reporting:
- Management accounting, reconciliation, intercompany reconciliation, accounts payable and receivable
- Ensure that all tax returns, tax declarations, and other required reports are submitted accurately and on time; management of Companies Act compliances
- Responsible for oversight and guidance of core activities such as fixed asset management and taxation
- Compliance with federal, state, and local legal requirements
- Responsible for preparation of monthly performance decks, sales reports, and other reporting requirements
- Hands-on experience with QuickBooks, Excel, and VBA
This profile will include the following responsibilities:
- Develop Parsers for XML and JSON Data sources/feeds
- Write Automation Scripts for product development
- Build API Integrations for 3rd Party product integration
- Perform Data Analysis
- Research on Machine learning algorithms
- Understand AWS cloud architecture and work with 3rd-party vendors for deployments
- Resolve issues in the AWS environment
We are looking for candidates with:
Qualification: BE/BTech/BSc-IT/MCA
Programming Language: Python
Web Development: Basic understanding of Web Development. Working knowledge of Python Flask is desirable
Database & Platform: AWS/Docker/MySQL/MongoDB
Basic Understanding of Machine Learning Models & AWS Fundamentals is recommended.
0-1 year experience
Any undergraduate/graduate can apply
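The parser duties listed in this role can be sketched roughly as below. This is an illustrative example only: the feed shapes and field names ("items", "id", "price") are assumptions, not part of any real feed spec, and it uses only Python's standard library.

```python
# Sketch: parse equivalent records out of a JSON feed and an XML feed.
import json
import xml.etree.ElementTree as ET

def parse_json_feed(payload: str) -> list[dict]:
    """Parse a JSON feed into a list of record dicts."""
    data = json.loads(payload)
    return data.get("items", [])

def parse_xml_feed(payload: str) -> list[dict]:
    """Parse an XML feed of <item> elements into the same record shape."""
    root = ET.fromstring(payload)
    return [{"id": item.get("id"), "price": float(item.findtext("price"))}
            for item in root.iter("item")]

json_src = '{"items": [{"id": "a1", "price": 9.99}]}'
xml_src = '<feed><item id="a1"><price>9.99</price></item></feed>'
print(parse_json_feed(json_src))  # [{'id': 'a1', 'price': 9.99}]
print(parse_xml_feed(xml_src))    # [{'id': 'a1', 'price': 9.99}]
```

Normalizing both formats into one record shape is what makes downstream data analysis and API integrations format-agnostic.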
Customer Service Representative
6-day working week
Role: Infra Management Tech Consultant
This is for you, if you:
- Love technology and have an innate interest in coding and building products.
- Aren’t afraid of taking ownership and accountability. You love challenges and won’t stop till you’ve found a solution.
- Possess knowledge of Software development industry best practices.
- Have hands-on experience and expertise on the technical competencies needed for the job, so that you can work independently.
- Are passionate about the travel/hospitality space and can understand the nuances of hotel operations.
- Know how to have fun and smile often
Certifications Required
- AWS DevOps Engineer (professional) - equivalent or higher
- AWS Solution Architect (professional) - equivalent or higher
- AWS Developer (associate) - equivalent or higher
What will you do?
- Manage and oversee infrastructure for all engineering activities.
- Creation & maintenance of Dev, QA, UAT, CS, Sales, and Prod environments (cloud + private).
- Backup & Restore underlying resources for failover.
- Manage connectivity, access & security for all environments.
- Upgrade environment for OS / software / service updates, certificates & patches.
- Upgrade environments for architecture upgrades.
- Upgrade environments for cost optimisations.
- Perform Release activity for all environments.
- Monitor (& setup alerts for) underlying resources for degradation of services.
- Automate as many infra activities as possible.
- Ensure Apps are healthy - log rotation, alarms, failover, scaling, recovery.
- Ensure DBs are healthy - size, indexing, fragmentation, failover, scaling, recovery.
- Assist in debugging production issues, engage with Development team to identify & apply long term fixes.
- Explore source code, databases, logs, and traces to find or create a solution.
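The log-exploration duty above can be sketched as a small triage script. This is a hypothetical example: the timestamp/level/message log format and the error lines are assumed for illustration, not taken from any real system.

```python
# Sketch: scan application log lines and count ERROR entries by a rough
# "error type" (the first word of the message), to spot degradation fast.
import re
from collections import Counter

LOG_LINE = re.compile(r"^(?P<ts>\S+) (?P<level>[A-Z]+) (?P<msg>.*)$")

def summarize_errors(lines):
    """Count ERROR messages, keyed by the first word of the message."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("msg").split()[0]] += 1
    return counts

sample = [
    "2024-01-01T00:00:01 INFO started",
    "2024-01-01T00:00:02 ERROR Timeout connecting to db",
    "2024-01-01T00:00:03 ERROR Timeout connecting to cache",
    "2024-01-01T00:00:04 ERROR OOM killed worker",
]
print(summarize_errors(sample))  # Counter({'Timeout': 2, 'OOM': 1})
```

In practice the same counting logic would feed an alerting rule (e.g. in Grafana or Prometheus, both named in this posting) rather than a print statement.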
Technical Competencies you’ll possess:
- B.E./B.Tech/MCA or equivalent
- 3+ years of meaningful work experience in DevOps handling complex services
- AWS DevOps Engineer/Architect/Developer certification
- Expertise with AWS services including but not limited to EC2, ECS, S3, RDS, Lambda, VPC, OpsWorks, CloudFront, Route53, CodeDeploy, SQS, SNS.
- Hands-on experience in maintaining production databases, including creating queries for identifying bottlenecks, creating/maintaining indexes where required, and defragmenting databases
- Hands-on experience with (and strong understanding of) Linux- and Windows-based OS
- Expertise with Docker and related tools
- Hands-on experience with infrastructure-as-code tools like AWS CloudFormation/Terraform and configuration management tools like Puppet or Chef
- Strong grasp of modern stack protocols/technologies like:
- Headers, caching
- IP/TCP/HTTP(S), WebSockets, SSL/TLS
- CDNs, DNS, proxies
- Expertise in setting up and maintaining a modern stack (on AWS & cloud), including:
- Version control systems like TFS, SVN, GitLab servers, etc.
- Application & proxy servers like Nginx, HAProxy, IIS, etc.
- Database servers like MSSQL, MySQL, Postgres, Mongo, Redis, Elasticsearch, etc.
- Monitoring tools like Grafana, Zabbix, Influx, Prometheus, etc.
- Experience in Python coding preferred
- Good understanding and experience in continuous integration/continuous deployment tools
Regards
Team Merito
- 8+ years of relevant work experience
- Well-versed in data structures, algorithms, and software design
- Programming experience with at least one of Java or Python, and object-oriented design
- Knowledge of SQL, NoSQL databases, messaging/caching technologies, and AWS deployments is a plus
- Exposure to the architecture and design (design patterns, security, reliability and scaling) of new and current systems. Experience in building highly scalable business applications, which involve implementing large complex business flows involving multiple third-party integrations
- Prior startup experience is a plus
Job title: Azure Architect
Locations: Noida, Pune, Bangalore and Mumbai
Responsibilities:
- Develop and maintain scalable architecture, database design and data pipelines and build out new Data Source integrations to support continuing increases in data volume and complexity
- Design and Develop the Data lake, Data warehouse using Azure Cloud Services
- Assist in designing end to end data and Analytics solution architecture and perform POCs within Azure
- Drive the design, sizing, POC setup, etc. of Azure environments and related services for the use cases and the solutions
- Review solution requirements and support architecture design to ensure the selection of appropriate technology, efficient use of resources, and integration of multiple systems and technologies.
- Must possess good client-facing experience with the ability to facilitate requirements sessions and lead teams
- Support internal presentations to technical and business teams
- Provide technical guidance, mentoring and code review, design level technical best practices
Experience Needed:
- 12-15 years of industry experience and at least 3 years in an architect role are required, along with at least 3 to 4 years’ experience designing and building analytics solutions in Azure.
- Experience in architecting data ingestion/integration frameworks capable of processing structured, semi-structured & unstructured data sets in batch & real-time
- Hands-on experience in the design of reporting schemas, data marts and development of reporting solutions
- Experience developing batch processing, streaming, and integration solutions and processing structured and unstructured data
- Demonstrated experience with ETL development both on-premises and in the cloud using SSIS, Data Factory, Azure Analysis Services, and other ETL technologies.
- Experience in design, development & deployment using Azure services (Azure Synapse, Data Factory, Azure Data Lake Storage, Databricks, Python, and SSIS)
- Worked with transactional, temporal, time series, and structured and unstructured data.
- Deep understanding of the operational dependencies of applications, networks, systems, security, and policy (both on-premises and in the cloud): VMs, networking, VPN (ExpressRoute), Active Directory, storage (Blob, etc.), Windows/Linux.
Mandatory Skills: Azure Synapse, Data Factory, Azure Data Lake Storage, Azure DW, Databricks, Python
Data Platform engineering at Uber is looking for a strong Technical Lead (Level 5a Engineer) who has built high-quality platforms and services that operate at scale. A 5a Engineer at Uber exhibits the following qualities:
- Demonstrate tech expertise: technical skills to go very deep or broad in solving classes of problems or creating broadly leverageable solutions.
- Execute large-scale projects: define, plan, and execute complex and impactful projects. You communicate the vision to peers and stakeholders.
- Collaborate across teams: act as a domain resource to engineers outside your team and help them leverage the right solutions. Facilitate technical discussions and drive to a consensus.
- Coach engineers: coach and mentor less experienced engineers and deeply invest in their learning and success. You give and solicit feedback, both positive and negative, to help improve the entire team.
- Tech leadership: lead the effort to define best practices in your immediate team, and help the broader organization establish better technical or business processes.
What You’ll Do
- Build a scalable, reliable, operable and performant data analytics platform for Uber’s engineers, data scientists, products and operations teams.
- Work alongside the pioneers of big data systems such as Hive, Yarn, Spark, Presto, Kafka, Flink to build out a highly reliable, performant, easy to use software system for Uber’s planet scale of data.
- Become proficient in the multi-tenancy, resource isolation, abuse prevention, and self-serve debuggability aspects of a highly performant, large-scale service while building these capabilities for Uber's engineers and operations folks.
What You’ll Need
- 7+ years of experience building large-scale products and distributed systems in a high-caliber environment.
- Architecture: Identify and solve major architectural problems by going deep in your field or broad across different teams. Extend, improve, or, when needed, build solutions to address architectural gaps or technical debt.
- Software Engineering/Programming: Create frameworks and abstractions that are reliable and reusable. You have advanced knowledge of at least one programming language and are happy to learn more. Our core languages are Java, Python, Go, and Scala.
- Platform Engineering: Solid understanding of distributed systems and operating systems fundamentals such as concurrency, multithreading, file systems, locking etc.
- Execution & Results: You tackle large technical projects/problems that are not clearly defined. You anticipate roadblocks and have strategies to de-risk timelines. You orchestrate work that spans multiple teams and keep your stakeholders informed.
- A team player: You believe that you can achieve more on a team, that the whole is greater than the sum of its parts. You rely on others’ candid feedback for continuous improvement.
- Business acumen: You understand requirements beyond the written word. Whether you’re working on an API used by other developers, an internal tool consumed by our operations teams, or a feature used by millions of customers, your attention to detail leads to a delightful user experience.