
Dot Net Implementation Engineer (Client Server)
Job description
We're Hiring!!
PLEASE NOTE: This job is not for .NET backend developers. Only candidates with prior experience in software deployment across client-server environments should apply.
TechnoRishi Consulting Pvt. Ltd. is seeking a skilled and experienced Implementation Engineer to join our Technical team for the Bangalore location.
As an Implementation Engineer, you will be responsible for deploying and configuring our software products for clients, ensuring smooth integration and functionality. You will work closely with our clients to understand their requirements, provide technical expertise, and troubleshoot any issues that arise during the implementation process.
The ideal candidate will have excellent problem-solving skills, attention to detail, and the ability to work collaboratively in a fast-paced environment. Join our dynamic team at TechnoRishi Consulting and play a key role in delivering innovative solutions to our clients in the ever-evolving technology landscape.
If this sounds interesting and exciting, we'd love for you to apply.
JOB ROLE: Implementation Engineer (Cloud/ Client Server)
JOB TYPES: Full-time, Regular / Permanent
CTC: 3-5 LPA
SCHEDULE
- Day shift
- Monday to Friday
- Weekend availability as and when required
EXPERIENCE: 1-2 years of experience
INDUSTRY: IT Services / Technology
ROLES & RESPONSIBILITIES
- Deploy and configure our software solutions on client systems, adhering to project timelines and specifications.
- Deploy .NET web applications; install and query IIS servers; configure Active Directory (LDAP, Azure).
- Conduct thorough requirements gathering sessions with clients to understand their specific needs and technical environments.
- Provide timely and effective technical support to clients, resolving issues related to installation, configuration, and application functionality.
- Diagnose and troubleshoot complex technical problems, utilizing logs, debugging tools, and system monitoring.
- Handle client tickets and provide solutions within SLA.
- Create and maintain comprehensive documentation, including installation guides, user manuals, and troubleshooting procedures.
- Maintain clear and consistent communication with clients throughout the implementation process.
- Provide regular updates on implementation progress and identify potential risks.
- Document all integrations done and review use cases with Product Manager and Solutions Architect for feedback.
- Work with the Solutions Architect to identify integration points and configuration requirements.
- Configure Operating Systems, Network configurations, and firewall rules as per project requirements.
SKILLSET REQUIREMENT
- DNS, IP, SSL, Hosting Knowledge
- Strong understanding of both server-side and client-side perspectives.
- Sound knowledge of Windows applications, services, and console applications.
- Experience deploying .NET web applications, IIS server, SQL Server, and Active Directory, with relevant domain knowledge.
- UG/PG in Computer Science Engineering
- Strong project management and organizational skills.
- Excellent written & verbal communication and interpersonal skills to work collaboratively with clients and internal teams.
- Excellent problem-solving skills and the ability to troubleshoot technical issues.

Strong Snowflake Data Architect profile (Cloud Data Platform / AI-led Data Transformation)
Mandatory (Experience 1) – Must have 8+ years of experience in Data Engineering / Data Architecture, with strong focus on building enterprise-scale data platforms
Mandatory (Experience 2) – Must have 3+ years of deep hands-on experience in Snowflake architecture, including designing and implementing scalable data warehouse solutions
Mandatory (Experience 3) – Strong expertise in Snowflake features including Resource Monitors, RBAC, Virtual Warehouses, Time Travel, Zero Copy Clone, and query performance optimization
Mandatory (Experience 4) – Proven experience building and managing data ingestion pipelines using Snowpipe, handling structured, semi-structured (JSON, XML), and columnar data formats (Parquet)
Mandatory (Experience 5) – Strong experience in cloud ecosystem, preferably AWS, including S3, Lambda, EC2, Redshift, and integration with Snowflake-based architectures
Mandatory (Experience 6) – Proven experience in migrating data from on-premise or legacy systems to Snowflake, including data modeling, transformation, and validation
Mandatory (Experience 7) – Hands-on experience in SQL, SnowSQL, Python, or PySpark for data transformation, automation, and monitoring
Mandatory (Experience 8) – Experience in data modeling, partitioning, micro-partitions, and re-clustering strategies in Snowflake
Mandatory (Experience 9) – Must have experience working in client-facing or consulting roles, including requirement gathering, solution design, and stakeholder communication
Mandatory (Skill 1) – Strong understanding of end-to-end data architecture including ETL/ELT pipelines, data lakes, and warehouse integration
Mandatory (Skill 2) – Experience in designing monitoring and automation frameworks using Python, Bash, or similar tools
Mandatory (Skill 3) – Ability to translate business requirements into scalable technical solutions and define future-state data architecture roadmaps
Mandatory (Note) – Only immediate joiners or candidates who can join within 15 days will be considered
Objectives of this role
- Develop, test, and maintain high-quality software using the Python programming language.
- Participate in the entire software development lifecycle, building, testing and delivering high-quality solutions.
- Collaborate with cross-functional teams to identify and solve complex problems.
- Write clean and reusable code that can be easily maintained and scaled.
Your tasks
- Create large-scale data processing pipelines to help developers build and train novel machine learning algorithms.
- Participate in code reviews, ensure code quality and identify areas for improvement to implement practical solutions.
- Debug code when required and troubleshoot any Python-related issues.
- Keep up to date with emerging trends and technologies in Python development.
Required skills and qualifications
- 3+ years of experience as a Python Developer with a strong portfolio of projects.
- Hands-on experience with Angular 12+.
- Bachelor's degree in Computer Science, Software Engineering or a related field.
- In-depth understanding of the Python software development stacks, ecosystems, frameworks, and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn, and PyTorch.
- Experience with front-end development using HTML, CSS, and JavaScript.
- Familiarity with database technologies such as SQL and NoSQL.
- Excellent problem-solving ability with solid communication and collaboration skills.
Preferred skills and qualifications
- Experience with popular Python frameworks such as Django, Flask or Pyramid.
- Knowledge of Angular
- A working understanding of cloud platforms such as AWS, Google Cloud or Azure.
- Contributions to open-source Python projects or active involvement in the Python community.
Role: Azure Fabric Data Engineer
Experience: 5–10 Years
Location: Pune/Bangalore
Employment Type: Full-Time
About the Role
We are looking for an experienced Azure Data Engineer with strong expertise in Microsoft Fabric and Power BI to build scalable data pipelines, Lakehouse architectures, and enterprise analytics solutions on the Azure cloud.
Key Responsibilities
- Design & build data pipelines using Microsoft Fabric (Pipelines, Dataflows Gen2, Notebooks).
- Develop and optimize Lakehouse / Data Lake / Delta Lake architectures.
- Build ETL/ELT workflows using Fabric, Azure Data Factory, or Synapse.
- Create and optimize Power BI datasets, data models, and DAX calculations.
- Implement semantic models, incremental refresh, and Direct Lake/DirectQuery.
- Work with Azure services: ADLS Gen2, Azure SQL, Synapse, Event Hub, Functions, Databricks.
- Build dimensional models (Star/Snowflake) and support BI teams.
- Ensure data governance & security using Purview, RBAC, and AAD.
Required Skills
- Strong hands-on experience with Microsoft Fabric (Lakehouse, Pipelines, Dataflows, Notebooks).
- Expertise in Power BI (DAX, modeling, Dataflows, optimized datasets).
- Deep knowledge of Azure Data Engineering stack (ADF, ADLS, Synapse, SQL).
- Strong SQL, Python/PySpark skills.
- Experience in Delta Lake, Medallion architecture, and data quality frameworks.
Nice to Have
- Azure Certifications (DP-203, PL-300, Fabric Analytics Engineer).
- Experience with CI/CD (Azure DevOps/GitHub).
- Databricks experience (preferred).
Note: One technical round must be conducted face-to-face at either the Pune or Bangalore office.
Job Title: Senior Full-Stack Developer (Tech Lead)
Experience: 5–8 Years
Location: Ahmedabad (Hybrid / On-site preferred)
Salary: Flexible for the right candidate
Note: This role is strictly for Ahmedabad-based (local) candidates. On-site presence is mandatory.
Role Overview
We are seeking a highly skilled Senior Full-Stack Developer (Tech Lead) to lead technical delivery and oversee end-to-end system execution. The ideal candidate will take ownership of architecture, ensure high-quality engineering standards, mentor junior team members, and effectively communicate with clients and stakeholders.
This role is best suited for a hands-on professional who enjoys solving complex problems, building scalable systems, and taking full responsibility for technical outcomes.
Key Responsibilities
- Design, develop, and maintain scalable backend and frontend systems
- Own system architecture and technical decision-making
- Lead code reviews and enforce clean, modular, and maintainable code practices
- Collaborate with clients to understand requirements and provide technical solutions
- Mentor and guide junior developers to improve overall team performance
- Ensure reliability, performance, and security of applications
- Drive best practices across development, deployment, and CI/CD workflows
- Design and integrate Generative AI–powered features where applicable (e.g., chatbots, content generation, automation tools)
Required Technical Skills
Backend (Must-Have)
- Strong experience with Node.js (Express.js / NestJS)
- RESTful API design and implementation
- Database design, optimization, and performance tuning
- Experience with PostgreSQL / MySQL
- Hands-on experience with MongoDB or other NoSQL databases
- Authentication and authorization mechanisms (JWT, OAuth2, RBAC)
- Willingness to learn Python and its frameworks (Django / FastAPI)
- Basic Python knowledge is an added advantage
Frontend (Must-Have)
- Experience with React.js or Next.js
- Strong knowledge of JavaScript / TypeScript
- Component-driven architecture and reusable UI patterns
- State management using Redux / Zustand / Context API
- Responsive UI development using MUI, Ant Design, or Tailwind CSS
Engineering Practices
- Proficient with Git, GitHub/GitLab
- Understanding of CI/CD pipelines
- Experience with Docker and containerization
- Familiarity with clean architecture and modular design patterns
Bonus Skills (Nice to Have)
- Microservices architecture
- Experience with Prisma ORM
- DevOps exposure (AWS, EC2, Vercel, Docker, Nginx)
- Caching solutions such as Redis
- Queue systems (Celery, Kafka, RabbitMQ)
- Exposure to AI / LLM integrations
Soft Skills
- Strong sense of ownership and accountability
- Excellent English communication skills (verbal and written)
- Proven ability to mentor and lead junior developers
- Strong analytical and problem-solving mindset
- Reliable, consistent, and delivery-focused
- Leadership maturity and professionalism
Samagra X - Tech Architect
As Samagra expands into Digital Public Goods (DPGs), we’re looking for a Tech Architect to help us shape how open-source products and platforms can transform the way governments worldwide function. At Samagra, we pride ourselves on being thoughtful and intentional about the impact that can be created through well-built products and platforms.
In this role, you’ll work with a mission-driven team on some of the world’s most complex governance problems, where technology will play a pivotal role. You’ll design the architecture of platforms, answering questions like:
- How to deliver quality advisory services to 100mn farmers nationwide?
- How can standardised assessment/practice tools be enabled to ensure FLN (foundational literacy and numeracy) in children?
- How to build the capacity of paramedical staff in the country?
- How can govt administrators in any domain monitor the progress of flagship programs?
- How to build an open commerce network to connect farmers to the best service providers?
Working at Samagra, you’ll be challenged and rewarded with impact at scale. We value excellence, collaboration, empathy, and a high-ownership mindset. The most successful candidates will exhibit work that reflects these values. We want you to enable every team member to do their best work. We expect you to have a solid technical background, excellent communication skills, a flair for open-source products/platforms, and a commitment to making everyone feel included.
Your Impact
In this role, you’ll have tremendous opportunities to learn, collaborate and impact millions of citizens in India and abroad by transforming how governments function. As a Tech Architect, you will:
- Take Ownership of technical vision for a given DPG - Define standards, drive technical architecture, strategy, and roadmap with the DPG working groups
- Collaborate with the DPG development partner to translate technical vision into a deliverable roadmap and product design. Pitch in with code, code reviews, architectural reviews and technical mentoring
- Work closely with the DPG development partner at various milestones of the product delivery to ensure a high-quality product is delivered and the product meets all the requirements, including debuggability, supportability, availability, scalability, cost-efficiency, and performance.
- Continuously develop and improve the interface between technology and the DPG community. Ensure that overall technical practices are scalable, constantly evaluating tech stacks to cater to evolving business needs.
- Participate in building ecosystems around DPG with the community - build platforms, plugins & apps.
- Ensure all internal processes & external services comply with security, privacy policies and regulations.
- Stay on top of digital trends, principles and paradigms and be able to relate how these trends will affect the DPG community.
- Engage with internal and external collaborators, and maintain relationships/partnerships with internal team members and adopters to develop strategies, goals, and objectives consistent with the DPG strategy.
Skills required
- Solid technical skills - Strong experience in having architected, built, operated and scaled distributed, large-scale, fault-tolerant systems; Worked on defining problem statements at all levels - RFCs to good first issues. Experience building and cultivating strong engineering practices; Experience with enterprise software development built and delivered on both on-premise and Cloud is preferred; Ability to diagnose technical problems, debug code, and automate tasks; Strong problem-solving and analytical skills.
- Leadership - Mix of intelligence, initiative, integrity, domain knowledge, verbal agility, and tactful stakeholder engagement, which allows you to rapidly earn the trust of astute teams and individuals across the company/community. You have a strong bias for action and being resourceful
- Collaboration and communication - with the geo-distributed teams in a fast-paced, entrepreneurial environment. Also, able to communicate across functional teams, keeping various engineering, product and business stakeholders informed.
- Planning & execution - Able to own and deliver large projects end to end, keeping track of timelines; keep track of (and context switch between) different threads and ensure that details don’t slip through the cracks
Qualifications
- B.Tech/BE (CS/IT) from Tier 1 or Tier 2 institutes in India or abroad, or equivalent; M.Tech/Ph.D. preferred
- 8+ years of software development experience with 4+ years contributing to software architecture
- Experience in architecting, designing, developing, testing, and deploying large-scale, distributed systems.
- Experience in contributing and actively participating in multiple open source projects.
- Strong leadership skills and experience in working with a developer ecosystem.
- Excellent communication and collaboration skills to work effectively with internal teams and external stakeholders.
- Experience automating development and test processes through CI/CD pipelines (Git, Jenkins, Artifactory, Docker containers).
- Experience in one of the following programming languages: Python/Javascript/Typescript/Java.
- Experience with RDBMS (Postgres, MariaDB) and any other large scale distributed database systems.
- Have expertise in one or more areas like building and scaling web apps, Machine Learning, UI Engineering, Information Visualization, etc.
About Samagra
Samagra is a mission-driven governance consulting firm. We co-work with the bureaucratic leadership across states and the Centre on long-term systemic transformation programmes cutting across domains. Since 2012, when Samagra was founded, we have worked with the Government of India and 7 state governments on 15+ large-scale systemic transformation projects in sectors like education, agriculture, skills, employment, and public service delivery. Over the last 5 years, we have also built strong GovTech and DataGov capabilities, now housed under the SamagraX umbrella. Our mission is to improve the quality of life of citizens through better governance. And this mission is what fuels every goal we chase. Want to know more about Samagra?
Check out this masterclass by our Founder and CEO, Gaurav Goel: https://www.youtube.com/watch?v=GPcPdI2SHwU
Our Tech Leadership
Rahul Kulkarni - Chief Technologist
Nitin Kashyap - Senior Vice President - Product
Utkarsh Vijay - Vice President
Sukhpreet Sekhon - Vice President
Suresh Unnikrishnan - Vice President - Engineering
Product Examples
- Ama KrushAI: https://www.samagragovernance.in/blog/2023-03-31-leveraging-artificial-intelligence-to-deliver-advisory-to-farmers/
- Nipun Lakshya App: https://www.youtube.com/watch?v=7CEo_R8XIhI
- e-SAMWAD: https://www.youtube.com/watch?v=kR8m4VS8kqA
- Krushak Odisha database (Data Science): https://www.samagragovernance.in/blog/2023-01-16-leveraging-data-science-algorithms-to-improve-data-quality-in-government/
- Sunbird cQube: https://www.samagragovernance.in/blog/2023-01-16-leveraging-data-science-algorithms-to-improve-data-quality-in-government/
- Sunbird SARAL: https://saral.sunbird.org/
Responsibilities
- Work with a team of test engineers to ensure the highest-quality product delivery and define measurable metrics to gauge progress against objective QA goals.
- Set and drive expectations around quality for major releases grounded in solid customer impact and product understanding
- Understand how all elements of the system software ecosystem work together and develop QA approaches that fit the overall strategy
- Be responsible for development of test strategies and creation of appropriate test harnesses
- Oversee the development and execution of test plans and monitor and report on test execution
- Be a trusted partner for senior management to determine best solutions, help drive alignment and implement decisions throughout your team.
- Generate and provide quality metrics for your area/application
- Collaborate with onsite managers on feature sprint planning and provide metrics on testing progress.
- Consistently maintain transparency with the work and identify potential risks during releases.
Basic Qualifications
- Bachelor’s degree in computer science, computer engineering or equivalent.
- 10+ years of industry experience.
- 5+ years of experience in hands-on testing and 3 years of experience as a manager.
- Experience managing senior test engineers
- Experience in Python
- Experience in automation testing
- Experience in managing resources and area ownership within a distributed and adaptable model.
- Strong knowledge of automated testing methods and technologies, preferably Selenium, REST Assured, or SoapUI
- A solid engineering foundation indicated by a demonstrated understanding of product design, life cycle, software development practices, and support services.
- Experience with standard test, defect, and automation management tools such as HP ALM, JIRA, and Jenkins.
- Since the team needs to develop continuous integration for the framework, we are looking for someone with CI/CD infrastructure experience.
- Proven track record to lead a team efficiently when working with tight deadlines across multiple projects while maintaining a balanced work environment.
- Experience coordinating teams across multiple sites and time zones
- Experience in delivering large releases to the customer through direct and partner teams.
Preferred Qualifications
- More than 3 years’ experience with scripting languages, such as Python.
- Experience with test automation tools and frameworks such as PyTest, Robot, and Postman.
- Experience with cloud environments such as GCP, AWS, and Azure.
- Experience with Kubernetes
- Knowledge on testing open source applications
- Domain knowledge in Data Engineering.
- Strong organizational skills, with the ability to track multiple test executions simultaneously and synthesize the results
- Experience in a SaaS environment that has an agile development process is a huge plus
- Strong people management skills with a proven ability to hire and grow talented programmatic and user-level personnel
- Experience working closely with development and business teams to communicate problem impacts and to understand business requirements
- Experience in agile development methodologies with continuous integration
- Experience with Android framework and internals
- Experience with AOSP (Android Open Source Project)
- Port AOSP to new architecture
- Port AOSP to new platform
- Experience with Linux Kernel drivers
- Port AOSP to new device / chipset
- Good understanding of HAL and device drivers
- Experience with Android SDK, NDK, JNI and IPC/Binder mechanisms
#DeviceDrivers #embeddedsoftware #embeddedjobs #PortAOSP #LinuxKernelDrivers #ARM #opensource
About the Role
The Dremio India team owns the DataLake Engine along with the cloud infrastructure and services that power it. With a focus on next-generation data analytics supporting modern table formats like Iceberg and Delta Lake, open-source initiatives such as Apache Arrow and Project Nessie, and hybrid-cloud infrastructure, this team provides many opportunities to learn, deliver, and grow in your career. We are looking for technical leaders with passion and experience in architecting and delivering high-quality distributed systems at massive scale.
Responsibilities & ownership
- Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
- Lead and mentor others about concurrency, parallelization to deliver scalability, performance and resource optimization in a multithreaded and distributed environment
- Propose and promote strategic company-wide tech investments taking care of business goals, customer requirements, and industry standards
- Lead the team to solve complex, unknown and ambiguous problems, and customer issues cutting across team and module boundaries with technical expertise, and influence others
- Review and influence designs of other team members
- Design and deliver architectures that run optimally on public clouds like GCP, AWS, and Azure
- Partner with other leaders to nurture innovation and engineering excellence in the team
- Drive priorities with others to facilitate timely accomplishments of business objectives
- Perform RCA of customer issues and drive investments to avoid similar issues in future
- Collaborate with Product Management, Support, and field teams to ensure that customers are successful with Dremio
- Proactively suggest learning opportunities about new technology and skills, and be a role model for constant learning and growth
Requirements
- B.S./M.S/Equivalent in Computer Science or a related technical field or equivalent experience
- Fluency in Java/C++ with 15+ years of experience developing production-level software
- Strong foundation in data structures, algorithms, multi-threaded and asynchronous programming models and their use in developing distributed and scalable systems
- 8+ years experience in developing complex and scalable distributed systems and delivering, deploying, and managing microservices successfully
- Subject Matter Expert in one or more of query processing or optimization, distributed systems, concurrency, micro service based architectures, data replication, networking, storage systems
- Experience in taking company-wide initiatives, convincing stakeholders, and delivering them
- Expert in solving complex, unknown and ambiguous problems spanning across teams and taking initiative in planning and delivering them with high quality
- Ability to anticipate and propose plan/design changes based on changing requirements
- Passion for quality, zero downtime upgrades, availability, resiliency, and uptime of the platform
- Passion for learning and delivering using latest technologies
- Hands-on experience working on projects on AWS, Azure, and GCP
- Experience with containers and Kubernetes for orchestration and container management in private and public clouds (AWS, Azure, and GCP)
- Understanding of distributed file systems such as S3, ADLS or HDFS
- Excellent communication skills and affinity for collaboration and teamwork













