For the Position of Team Leader - Data Entry Team
About Clicbrics (Redbrics ITeS India Pvt Ltd.)
“Clicbrics” is an online tech real estate marketplace for buyers across all segments. Clicbrics empowers
consumers with data and full knowledge of properties with the help of advanced digital technologies. We have
a team of high-energy technology specialists offering solutions in AI, machine learning, computer vision, and
related technologies. Established in 2017, our expertise lies in devising holistic, end-to-end solutions in
contemporary technologies that help turn real estate business ideas into reality.
For more details, please visit our website www.clicbrics.com
Experience Required: 3-6 years
Qualification Required: Any graduate with team-handling experience of at least 1 year in the Real Estate
industry.
Job Location: Gurgaon
Skills Required:
1. Young go-getters who excel in challenging work environments and possess the drive to perform.
More than experience, you'll need to be self-motivated and eager to learn and experiment with new
technologies.
2. Required to handle and motivate a team of 6 data entry specialists to achieve profitability.
3. Expertise in MS Office and internet/web research activities. Knowledge of HTML and the Adobe PDF
package would be an added advantage.
4. Demonstrated multi-tasking and problem-solving skills.
5. Strong business acumen and mastery of data entry job duties.
Job Responsibilities:
1. Lead the team and allocate the work, while also working as a team player on assigned tasks.
2. Maintain weekly, fortnightly, and monthly data reports.
3. Coordinate with clients as per requirements.
4. Periodically review team members' performance and provide feedback.
5. Report progress and measurements of success to Management.
6. Daily management of a team of employees to include motivating, recognizing, rewarding,
coaching, counselling, training, and problem solving.
Candidates from a Real Estate background are preferred.

Similar jobs
Job Title: Data Engineer
About the Role
We are looking for a highly motivated Data Engineer to join our growing team and play
a critical role in shaping the data foundation of different software platforms. This role sits
at the intersection of data engineering, product, and business stakeholders, and is
responsible for building reliable data pipelines, delivering actionable insights, and
ensuring data quality across systems.
You will work closely with internal teams and external partners to translate business
requirements into scalable data solutions, while maintaining high standards for data
integrity, performance, and usability.
Key Responsibilities
Data Engineering & Architecture
Design, build, and maintain scalable data pipelines and ETL/ELT processes
Develop and optimize data models in PostgreSQL and cloud-native
architectures
Work within AWS ecosystem (e.g., S3, Lambda, RDS, Glue, Redshift, etc.) to
support data workflows
Ensure efficient ingestion and processing of large-scale datasets
Business & Partner Integration
Collaborate directly with business stakeholders and external partners to
gather requirements and deliver reporting solutions
Translate ambiguous business needs into structured data models and
dashboards
Integrate with third-party APIs and other external data sources
Data Quality & Governance
Implement robust data validation, monitoring, and QA processes
Ensure consistency, accuracy, and reliability of data across the platform
Troubleshoot and resolve data discrepancies proactively
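The validation and QA duties above can be pictured with a minimal sketch. This is an illustrative example only, not this role's actual schema: the field names (`id`, `email`, `created_at`) and rules are invented.

```python
# Minimal row-level data validation sketch: each rule returns an error
# message (or None), and validate() collects every failure per row.
from datetime import datetime

def _check_date(v):
    try:
        datetime.strptime(v, "%Y-%m-%d")
        return None
    except (TypeError, ValueError):
        return "created_at must be YYYY-MM-DD"

RULES = {
    "id": lambda v: None if str(v).isdigit() else "id must be numeric",
    "email": lambda v: None if "@" in str(v) else "email missing '@'",
    "created_at": _check_date,
}

def validate(rows):
    """Return a list of (row_index, field, error) tuples for failing cells."""
    errors = []
    for i, row in enumerate(rows):
        for field, rule in RULES.items():
            msg = rule(row.get(field))
            if msg:
                errors.append((i, field, msg))
    return errors

rows = [
    {"id": "101", "email": "a@x.com", "created_at": "2024-01-05"},
    {"id": "abc", "email": "no-at-sign", "created_at": "2024-13-99"},
]
print(validate(rows))
```

In a production pipeline the same pattern would typically run inside the ingestion job and route failing rows to a quarantine table rather than just reporting them.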
Reporting & Analytics Enablement
Build datasets and pipelines that power dashboards and reporting tools
Support internal teams with ad hoc analysis and data requests
Partner with product and engineering teams to embed data into the SaaS product experience
Performance & Scalability
Optimize queries, pipelines, and storage for performance and cost efficiency
Continuously improve system scalability as data volume and complexity grow
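As a concrete (if toy) picture of the query-optimization work above, the sketch below uses SQLite from Python's standard library; the table and index names are invented for illustration, and on the job this would be PostgreSQL (`EXPLAIN`, `pg_stat_statements`) rather than SQLite.

```python
# Toy illustration of index-driven query optimization using the stdlib
# sqlite3 module: the same lookup goes from a full-table SCAN to an
# index SEARCH once a suitable index exists.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i, i % 50, "2024-01-01") for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows describe the access path in their last column
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT ts FROM events WHERE user_id = 7"
before = plan(query)   # full scan: no index available yet
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)    # now satisfied via the index

print(before)
print(after)
```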
Required Qualifications
3–6+ years of experience in Data Engineering or related role
Strong proficiency in Python for data processing and scripting
Advanced experience with PostgreSQL (query optimization, schema design)
Hands-on experience with AWS data architecture (S3, RDS, Lambda, Glue,
Redshift, etc.)
Experience integrating with external APIs
Solid understanding of ETL/ELT pipelines, data modeling, and warehousing
concepts
Experience working cross-functionally with business stakeholders
Preferred Qualifications
Experience in AdTech, eCommerce, or SaaS platforms
Familiarity with BI tools (e.g., Looker, Tableau, Power BI)
Experience with workflow orchestration tools (e.g., Airflow)
Understanding of data governance and compliance best practices
Exposure to real-time or streaming data pipelines
What We’re Looking For
Strong problem-solver who can operate in a fast-paced, ambiguous
environment
Ability to balance technical depth with business context
Excellent communication skills — able to work directly with non-technical
stakeholders
Ownership mindset with a focus on execution and quality
Employment type: Contract basis
Key Responsibilities
- Design, develop, and maintain scalable data pipelines using PySpark and distributed computing frameworks.
- Implement ETL processes and integrate data from structured and unstructured sources into cloud data warehouses.
- Work across Azure or AWS cloud ecosystems to deploy and manage big data workflows.
- Optimize performance of SQL queries and develop stored procedures for data transformation and analytics.
- Collaborate with Data Scientists, Analysts, and Business teams to ensure reliable data availability and quality.
- Maintain documentation and implement best practices for data architecture, governance, and security.
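On the job, the pipelines above would be PySpark jobs; purely to illustrate the extract-transform-load shape without a Spark dependency, here is a stdlib-only sketch with made-up field names.

```python
# Stdlib-only sketch of one ETL step: extract CSV text, transform
# (cleanse types, drop bad rows, enrich with a derived field), load as JSON.
import csv
import io
import json

RAW = """order_id,amount,country
1,19.99,us
2,not-a-number,de
3,5.00,US
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])        # cleanse: enforce numeric type
        except ValueError:
            continue                             # drop rows that fail validation
        out.append({
            "order_id": int(row["order_id"]),
            "amount": amount,
            "country": row["country"].upper(),   # cleanse: normalize casing
            "is_large": amount >= 10.0,          # enrich: derived flag
        })
    return out

def load(rows):
    return json.dumps(rows)                      # stand-in for a warehouse write

clean = transform(extract(RAW))
print(load(clean))
```

In PySpark the same step would read via `spark.read.csv(...)`, apply the cleansing and enrichment with `withColumn` expressions, and write to the warehouse instead of serializing to a JSON string.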
⚙️ Required Skills
- Programming: Proficient in PySpark, Python, SQL, and MongoDB
- Cloud Platforms: Hands-on experience with Azure Data Factory, Databricks, or AWS Glue/Redshift.
- Data Engineering Tools: Familiarity with Apache Spark, Kafka, Airflow, or similar tools.
- Data Warehousing: Strong knowledge of designing and working with data warehouses like Snowflake, BigQuery, Synapse, or Redshift.
- Data Modeling: Experience in dimensional modeling, star/snowflake schema, and data lake architecture.
- CI/CD & Version Control: Exposure to Git, Terraform, or other DevOps tools is a plus.
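The star-schema concept named above can be made concrete with a tiny example using stdlib SQLite. The table and column names are invented for illustration: one fact table holds measures plus foreign keys, and descriptive dimension tables are joined in at query time.

```python
# Tiny star schema: a fact table (measures + foreign keys) surrounded by
# dimension tables, queried with a join-and-aggregate, via stdlib sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER, revenue REAL);

INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
INSERT INTO dim_date    VALUES (10, 2023), (11, 2024);
INSERT INTO fact_sales  VALUES (1, 10, 12.0), (1, 11, 8.0), (2, 11, 30.0);
""")

# The classic star-schema query shape: join the fact table to its
# dimensions, then group and aggregate the measures.
rows = conn.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key    = f.date_key
    GROUP BY p.category, d.year
    ORDER BY p.category, d.year
""").fetchall()

print(rows)
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.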
🧰 Preferred Qualifications
- Bachelor's or Master's in Computer Science, Engineering, or related field.
- Certifications in Azure/AWS are highly desirable.
- Knowledge of business intelligence tools (Power BI, Tableau) is a bonus.
Job Description
Overview:
We are seeking an experienced Azure Data Engineer to join our team in a hybrid Developer/Support capacity. This role focuses on enhancing and supporting existing Data & Analytics solutions by leveraging Azure Data Engineering technologies. The engineer will work on developing, maintaining, and deploying IT products and solutions that serve various business users, with a strong emphasis on performance, scalability, and reliability.
Must-Have Skills:
Azure Databricks
PySpark
Azure Synapse Analytics
Key Responsibilities:
- Incident classification and prioritization
- Log analysis and trend identification
- Coordination with Subject Matter Experts (SMEs)
- Escalation of unresolved or complex issues
- Root cause analysis and permanent resolution implementation
- Stakeholder communication and status updates
- Resolution of complex and major incidents
- Code reviews (two per individual per week) to ensure adherence to standards and optimize performance
- Bug fixing of recurring or critical issues identified during operations
- Gold layer tasks, including enhancements and performance tuning.
- Design, develop, and support data pipelines and solutions using Azure data engineering services.
- Implement data flow and ETL techniques leveraging Azure Data Factory, Databricks, and Synapse.
- Cleanse, transform, and enrich datasets using Databricks notebooks and PySpark.
- Orchestrate and automate workflows across services and systems.
- Collaborate with business and technical teams to deliver robust and scalable data solutions.
- Work in a support role to resolve incidents, handle change/service requests, and monitor performance.
- Contribute to CI/CD pipeline implementation using Azure DevOps.
Technical Requirements:
- 4 to 6 years of experience in IT and Azure data engineering technologies.
- Strong experience in Azure Databricks, Azure Synapse, and ADLS Gen2.
- Proficient in Python, PySpark, and SQL.
- Experience with file formats such as JSON and Parquet.
- Working knowledge of database systems, with a preference for Teradata and Snowflake.
- Hands-on experience with Azure DevOps and CI/CD pipeline deployments.
- Understanding of Data Warehousing concepts and data modeling best practices.
- Familiarity with SNOW (ServiceNow) for incident and change management.
Non-Technical Requirements:
- Ability to work independently and collaboratively in virtual teams across geographies.
- Strong analytical and problem-solving skills.
- Experience in Agile development practices, including estimation, testing, and deployment.
- Effective task and time management with the ability to prioritize under pressure.
- Clear communication and documentation skills for project updates and technical processes.
Technologies:
- Azure Data Factory
- Azure Databricks
- Azure Synapse Analytics
- PySpark / SQL
- Azure Data Lake Storage (ADLS), Blob Storage
- Azure DevOps (CI/CD pipelines)
Nice-to-Have:
- Experience with Business Intelligence tools, preferably Power BI
- DP-203 certification (Azure Data Engineer Associate)
NOTE -
Weekly rotational shifts:
11am to 8pm
2pm to 11pm
5pm to 2am
P.S. - On any one weekend, engineers should be available on call; if an issue arises, they should handle it on their own. On-call support is required once a month.
Skills Required: Angular 2+, AngularJS, JavaScript, HTML/CSS, jQuery, AJAX, Bootstrap
Additional skills (good to have): Java Spring Boot, REST API development, SQL/NoSQL database knowledge
Tremendous opportunity to make impact on business and ad-tech industry.
Benefits:
Build shit that matters!!!
Experience the impact of your hard work
Work hard and party harder
Work with an extremely committed group of people
Explore and implement new technologies
Who Should apply?
Only serious job seekers.
Technically strong candidates who are willing to take up challenging roles and want to raise their career graph.
Why Think n Solutions Software?
Exposure to the latest technology, opportunity to work on different platforms, and direct client interaction
Rapid career growth
Friendly, knowledge-sharing environment
Technical Skills Desired:
- Must have experience in either of the JavaScript frameworks: Angular / ReactJS (preferred)
- Must have experience in either backend software development [J2EE, Spring Boot, Spring Core, JPA] or Node.js
- Must have experience in designing/implementing Hibernate/ORM, RESTful web services, and microservices using Java or Node.js
- Knowledge of relational/NoSQL databases (e.g., Oracle, MySQL, MongoDB)
- Knowledge of versioning (Git or SVN), build tools (Maven/Ant/Gradle/JUnit), unit testing, and code-coverage tools is desired
- Knowledge of app/web servers (NGINX / Tomcat / JBoss WildFly)
- Understands the process of new application development and has the ability to apply these concepts with minimal mentoring and supervision
- Advanced knowledge of and experience using an IDE (e.g., Eclipse, SonarQube, STS, VS Code)
- Knowledge of cloud platforms, technologies, and deployments is an added advantage
- Resolves technical issues through debugging, research, and investigation
- Must have experience with Agile tools
Functional Skills Desired:
- Good knowledge of product development domains
- Knowledge of the Finance and Insurance domains preferred
- Maintain quality and ensure responsiveness of applications
- Complete application development by coordinating requirements, schedules, and activities, and contributing to team meetings
- Troubleshoot development and production problems across multiple environments and operating platforms
- Collaborate with the rest of the engineering team to design and launch new features
- Understanding and implementation of security and data protection
- Ensure designs comply with specifications, standards, and industry best practices
- Usage of process tools - JIRA, TFS, HP QC, or any other Agile tools; knowledge of the CMM Level 3 development process is also desired
- Self-motivated and able to work independently with minimal supervision
- Maintain code integrity and organization
- Experience working with graphic designers and converting designs to visual elements
- Follow the coding standards for Java/Node and Angular
Technical Skills Good to Have:
- Usage of troubleshooting tools like JProfiler and JMeter, or application performance tuning
- Exposure to non-web-based development, for both mobile and desktop
- Development experience using Docker, Kubernetes, containerization, etc. in AWS or other cloud platforms
- Willingness to take up proofs of concept and showcase technical capability
- Usage of design tools such as Visio or draw.io
- Usage of CI/CD pipelines
Informatica PowerCenter (9.x, 10.2): Minimum 2+ years of experience
SQL / PL/SQL: Understanding of SQL procedures; able to convert procedures into Informatica mappings.
Good to have: Knowledge of Windows batch scripting is an advantage.

Set the monthly, weekly, and daily targets for the team and ensure that the targets are achieved
Maintain the sales report of the team
Own hiring, firing, and deliverables of at least 10 education counselors
Motivate and mentor the team to achieve and exceed targets
Design and develop business models
Conduct weekly reviews for performance and training
Identify areas of improvement and KPIs
Heavy phone calling for negotiation and objection handling