
● Good knowledge of Dimensional data warehouse systems
● Reporting data models with SSIS and MS SQL Server
● Knowledge of all aspects of BI, preferably data modeling, data integration, data analysis, and reporting
● Good working knowledge of MS Azure environment
● Knowledge of BI tools like Tableau or PowerBI
● Good communication skills
Comes with the opportunity to upskill on Big Data clusters

Role Overview:
We are seeking a highly skilled and experienced Lead Web App Developer - Backend to join our dynamic team in Bengaluru. The ideal candidate will have a strong background in backend development, microservices architecture, and cloud technologies, with a proven ability to deliver robust, scalable solutions. This role involves designing, implementing, and maintaining complex distributed systems, primarily in the Mortgage Finance domain.
Key Responsibilities:
- Cloud-Based Web Applications Development:
- Lead backend development efforts for cloud-based web applications.
- Work on diverse projects within the Mortgage Finance domain.
- Microservices Design & Development:
- Design and implement microservices-based architectures.
- Ensure scalability, availability, and reliability of distributed systems.
- Programming & API Development:
- Write efficient, reusable, and maintainable code in Python, Node.js, and Java.
- Develop and optimize RESTful APIs (a minimal sketch follows the responsibilities list).
- Infrastructure Management:
- Leverage AWS platform infrastructure to build secure and scalable solutions.
- Utilize tools like Docker for containerization and deployment.
- Database Management:
- Work with RDBMS (MySQL) and NoSQL databases to design efficient schemas and optimize queries.
- Team Collaboration:
- Collaborate with cross-functional teams to ensure seamless integration and delivery of projects.
- Mentor junior developers and contribute to the overall skill development of the team.
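For illustration only, here is a minimal sketch of the kind of RESTful endpoint work described above. The posting does not name a web framework, so FastAPI, the /loans resource, and the in-memory store are assumptions for the sketch.

```python
# Illustrative sketch only: FastAPI and the /loans resource are assumptions,
# since the posting does not name a specific web framework or domain model.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Hypothetical in-memory store standing in for a real RDBMS/NoSQL backend.
LOANS = {1: {"id": 1, "principal": 250_000, "status": "active"}}

@app.get("/loans/{loan_id}")
async def get_loan(loan_id: int):
    """Return a single loan record, or a 404 if it does not exist."""
    loan = LOANS.get(loan_id)
    if loan is None:
        raise HTTPException(status_code=404, detail="Loan not found")
    return loan
```

Run locally with `uvicorn app:app --reload` and request /loans/1; in a production setup this would sit behind the AWS infrastructure and Docker packaging mentioned below.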
Core Requirements:
- Experience: 10+ years in backend development, with at least 3 years designing and delivering large-scale products on a microservices architecture.
- Technical Skills:
- Programming Languages: Python, Node.js, Java.
- Frameworks & Tools: AWS (Lambda, RDS, etc.), Docker.
- Database Expertise: Proficiency in RDBMS (MySQL) and NoSQL databases.
- API Development: Hands-on experience in developing REST APIs.
- System Design: Strong understanding of distributed systems, scalability, and availability.
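As a rough illustration of the AWS Lambda experience listed under Frameworks & Tools, below is a minimal Python handler sketch. The API Gateway proxy event shape and the response body are assumptions, not requirements from the posting.

```python
import json

# Minimal AWS Lambda handler sketch (Python runtime). The API Gateway proxy
# event shape and the response body are assumptions, not posting requirements.
def lambda_handler(event, context):
    loan_id = (event.get("pathParameters") or {}).get("loan_id")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"loan_id": loan_id, "status": "ok"}),
    }
```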
Additional Skills (Preferred):
- Experience with modern frontend frameworks like React.js or AngularJS.
- Strong design and architecture capabilities.
What We Offer:
- Opportunity to work on cutting-edge technologies in a collaborative environment.
- Competitive salary and benefits package.
- Flexible hybrid working model.
- Chance to contribute to impactful projects in the Mortgage Finance domain.
- Experience in Core Java 5.0 and above, CXF, Spring.
- Extensive experience in developing enterprise-scale n-tier applications for the financial domain. Should possess good architectural knowledge and be aware of enterprise application design patterns.
- Should have the ability to analyze, design, develop and test complex, low-latency client-facing applications.
- Good experience with microservices, data structures, OOP, algorithms, multithreading, etc.
- Good development experience with RDBMS, preferably Sybase database.
- Good knowledge of multi-threading and high-volume server-side development.
- Experience in sales and trading platforms in investment banking/capital markets.
- Basic working knowledge of Unix/Linux.
- Experience in high-level and low-level design.
- Excellent problem solving and coding skills in Java.
- Strong interpersonal, communication and analytical skills.
- Ability to clearly express design ideas and thoughts.
● 5+ years of experience as a Data Engineer or in a related role.
● 5+ years of experience in application development using Python
● Strong experience with SQL; NoSQL is good to have.
● Experience with Agile engineering practices.
● Preferred: experience writing queries for RDBMS and cloud-based data warehousing solutions such as Snowflake (a minimal query sketch follows this list).
● Ability to work independently or as part of a team.
● Experience with cloud platforms, preferably AWS, is good to have
● Experience with ETL/ELT tools and methodologies.
● Experience working with real-time data streaming and streaming platforms
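As a minimal, non-authoritative sketch of the query-writing experience mentioned above (RDBMS and cloud warehouses such as Snowflake), the snippet below uses the snowflake-connector-python client; the account, credentials, and the orders table are placeholders.

```python
# Sketch of a parameterised warehouse query using snowflake-connector-python.
# The account, credentials, and the ORDERS table are placeholders.
import snowflake.connector

def daily_order_counts(start_date: str):
    conn = snowflake.connector.connect(
        account="my_account",      # placeholder
        user="my_user",            # placeholder
        password="***",            # placeholder
        warehouse="ANALYTICS_WH",  # placeholder
        database="SALES",          # placeholder
        schema="PUBLIC",
    )
    try:
        cur = conn.cursor()
        # Bind variables keep the SQL parameterised instead of string-built.
        cur.execute(
            "SELECT order_date, COUNT(*) AS orders "
            "FROM orders WHERE order_date >= %s "
            "GROUP BY order_date ORDER BY order_date",
            (start_date,),
        )
        return cur.fetchall()
    finally:
        conn.close()
```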
Egnyte is seeking an experienced Sr. Software Engineer to join our Software Engineering (Infrastructure) group. The Software Engineering (Infrastructure) group builds large distributed components and services that run Egnyte’s Cloud Platform. Our code serves billions of requests per day with sub-second latency in a fault-tolerant environment. Some of the responsibilities for this group include Egnyte’s Cloud File System, Object Store, Metadata Stores, Search Systems, Recommendations Systems, Synchronization, and intelligent caching of multi-petabyte datasets. We are looking for candidates with a shared passion for building large-scale distributed systems and a keen sense for tackling complexities that come with scaling through multiple orders of magnitude.
What You’ll Do (including, but not limited to)
- Design and develop highly-scalable elastic cloud architecture that seamlessly integrates with on-premises systems
- Challenge and redefine existing architectural fundamentals to provide the next level of performance and scalability; ability to foresee post-deployment design challenges and performance and scale bottlenecks
- Work with multicultural, geographically distributed teams and closely coordinate with cross-functional teams in multiple time zones.
- Deliver enterprise-grade products to customers and continuously work with engineering team to refine products in the field
- Mentor interns and junior engineers, collaborate with Operations, and work closely with CTO on roadmap items
- Perform extensive penetration testing to ensure security across hybrid public/private cloud deployments
- Monitor and manage 3,000+ nodes using modern DevOps tools and APM solutions
- Proactive performance and exception analysis
Your Qualifications
- 5+ years of relevant industry work experience
- Demonstrated success designing and developing complex systems
- Expertise with multi-tenant, highly complex, cloud solutions
- Experience owning all aspects of software engineering, from design to implementation, QA and maintenance.
- Experience with the following technologies: Java, SQL, Linux, Python, HBase/BigTable
- Data-driven decision process
- Reliance on unit testing rather than manual QA
- Knowledge of DevOps techniques
- BS or MS degree in Computer Science or related field
Bonus Skills
- Experience with Hybrid and/or on-premises solutions
- Experience in working with AWS or GCP
- Experience with the following technologies: Nginx, HAProxy, BigQuery, New Relic, Graphite, and/or Puppet
- Security / Governance expertise
About Egnyte
In a content critical age, Egnyte fuels business growth by enabling content-rich business processes, while also providing organizations with visibility and control over their content assets. Egnyte’s cloud-native content services platform leverages the industry’s leading content intelligence engine to deliver a simple, secure, and vendor-neutral foundation for managing enterprise content across business applications and storage repositories. More than 16,000 customers trust Egnyte to enhance employee productivity, automate data management, and reduce file-sharing cost and complexity. Investors include Google Ventures, Kleiner Perkins Caufield & Byers, and Goldman Sachs. For more information, visit www.egnyte.com.
What You'll Do
You will be part of our data platform & data engineering team. As part of this agile team, you will work in our cloud-native environment and perform the following activities to support core product development and client-specific projects:
- You will develop the core engineering frameworks for an advanced self-service data analytics product.
- You will work with multiple types of data storage technologies such as relational, blobs, key-value stores, document databases and streaming data sources.
- You will work with the latest technologies for data federation with MPP (Massively Parallel Processing) capabilities.
- Your work will entail backend architecture to enable data modeling, data queries and API development for both back-end and front-end data interfaces.
- You will support client-specific data processing needs using SQL and Python/PySpark (a minimal PySpark sketch follows this list)
- You will integrate our product with other data products through Django APIs
- You will partner with other team members in understanding the functional / non-functional business requirements, and translate them into software development tasks
- You will follow software development best practices, ensuring that the code architecture and the quality of the code you write meet the high standard expected of enterprise software
- You will be a proactive contributor to team and project discussions
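As a minimal sketch of the SQL and Python/PySpark work mentioned above, the snippet below shows a simple PySpark aggregation; the input path and column names are placeholders, not taken from this description.

```python
# Minimal PySpark sketch of a client-specific aggregation. The input path and
# column names (event_ts, client_id) are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("client-metrics").getOrCreate()

events = spark.read.parquet("s3a://bucket/events/")  # placeholder path
daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "client_id")
    .agg(F.count(F.lit(1)).alias("event_count"))
)
daily.write.mode("overwrite").parquet("s3a://bucket/marts/daily_events/")
```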
Who you are
- Strong education track record: Bachelor's or an advanced degree in Computer Science or a related engineering discipline from an Indian Institute of Technology or an equivalent premier institute.
- 2-3 years of experience in data queries, data processing and data modeling
- Excellent ANSI SQL skills to handle complex queries
- Excellent Python and Django programming skills.
- Strong knowledge of and experience with modern, distributed data stack components such as Spark, Hive, Airflow, Kubernetes, Docker, etc.
- Experience with cloud environments (AWS, Azure) and native cloud technologies for data storage and data processing
- Experience with relational SQL and NoSQL databases, including Postgres, blob stores, MongoDB, etc.
- Familiarity with ML models is highly preferred
- Experience with Big Data processing and performance optimization
- Should know how to write modular, optimized, and documented code.
- Should have good knowledge of error handling.
- Experience with version control systems such as Git
- Strong problem solving and communication skills.
- Self-starter, continuous learner.
Good to have some exposure to
- Start-up experience is highly preferred
- Exposure to any Business Intelligence (BI) tools like Tableau, Dundas, Power BI etc.
- Agile software development methodologies.
- Working in multi-functional, multi-location teams
What You'll Love About Us – Do ask us about these!
- Be an integral part of the founding team. You will work directly with the founder
- Work Life Balance. You can't do a good job if your job is all you do!
- Prepare for the Future. Academy – we are all learners; we are all teachers!
- Diversity & Inclusion. HeForShe!
- Internal Mobility. Grow with us!
- Business knowledge of multiple sectors
Senior Software Engineer (Python)
Job description
Fulfil’s software engineers develop the next-generation technologies that change how millions of customer orders are fulfilled by merchants. Our products need to handle information at massive scale. We're looking for engineers who bring fresh ideas from all areas into our technology.
As a senior software engineer, you will work on our Python-based ORM and applications that scale to handle millions of transactions every hour. This is mission-critical software, and your primary focus will be building robust and scalable solutions that are easy to maintain.
In this role, you will collaborate closely with the rest of the team, working on different layers of infrastructure in an international environment. Therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important.
What You’ll Do:
- Own definition and implementation of API interfaces (REST and GraphQL). We take pride in our 100% open API with over 600 endpoints.
- Implement simple solutions to complex business logic that enables our merchants to manage financials, orders and shipments across millions of transactions.
- Build reusable components and packages for future use.
- Translate specs and user stories into reviewable, test-covered patches.
- Peer review code and refactor existing code.
- Integrate with our eCommerce partners (Shopify, BigCommerce, Amazon), shipping partners (UPS, USPS, FedEx, DHL) and EDI.
- Manage Kubernetes and Docker based global deployment of our infrastructure.
Requirements
We’re Looking for Someone With:
- Experience working with ORMs like SQLAlchemy or Django (a brief model-and-test sketch follows this list)
- Experience with SQL and databases (Postgres preferred)
- Experience in developing large server side applications and microservices
- Ability to create high-quality code
- Experience with python testing tools (pytest) and test automation
- Familiarity with code versioning tools like Git
- Strong sense of ownership and leadership quality
- Experience with the tools of our web stack:
- Python
- Celery
- Postgres
- Redis
- RabbitMQ
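As a brief, hedged sketch of the ORM and pytest experience listed above, the snippet below pairs a hypothetical SQLAlchemy model with a pytest test against an in-memory SQLite database; the Order model and field names are assumptions, not Fulfil's actual schema.

```python
# Sketch pairing a hypothetical SQLAlchemy model with a pytest test against an
# in-memory SQLite database; the Order model is not taken from the posting.
import pytest
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Order(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True)
    reference = Column(String, nullable=False)

@pytest.fixture()
def session():
    # A fresh schema per test keeps tests isolated and fast.
    engine = create_engine("sqlite:///:memory:")
    Base.metadata.create_all(engine)
    with Session(engine) as s:
        yield s

def test_order_roundtrip(session):
    session.add(Order(reference="SO-1001"))
    session.commit()
    assert session.query(Order).filter_by(reference="SO-1001").count() == 1
```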
Nice to Haves:
- Prior experience at a growth stage Internet/Software company
- Experience with ReactJS, Google Cloud, Heroku
- Cloud deployment and scaling experience
About Us:
Fulfil.io helps high-growth, high-volume merchants simplify operations and scale for growth. With the rise in omni-channel commerce, Fulfil was founded with the simple idea that merchant operations need to be simplified in order to deliver amazing retail experiences. Fulfil enables businesses to turn their back office operations into an accelerator for growth by integrating order management, inventory management, warehouse management, vendor/supplier management, wholesale, manufacturing, financials, and customer service into one seamless solution. We believe merchants should love their operations platform, and we work hard to make that happen every single day. Fulfil.io is a trusted solution for brands like EndySleep, Mejuri, Lie-Nielsen Toolworks, and many more.
Fulfil.io is a venture backed technology company with offices in San Francisco, Toronto, and Bangalore. The team is made up of people who want to feel challenged at work, be the best at their craft and learn from one another. We come from different backgrounds and experiences, all passionate about the work we do, the team we do it with, and the customers we do it for. Join us in our journey to simplify operations and empower merchants around the world!
In this role, you should be able to write functional code with a sharp eye for spotting defects. You should be a team player and an excellent communicator. If you are also passionate about the .NET framework and software design/architecture, we'd like to meet you.
Your goal will be to work with internal teams to design, develop and maintain software.
Responsibilities and Duties :
- Writing clean, scalable code using .NET programming languages.
- Build high-quality scalable and predictable web applications on the .NET Technology stack and maintain internal and external facing web applications.
- Creating and integrating services and APIs using Web API 2 for various products and applications.
- Working on system architecture and databases such as MongoDB, MySQL, and other NoSQL stores.
- Make regular modifications to existing software for error correction, adaptation to new hardware, and improving overall function and performance
- Evaluate new code for reliable architecture, stability, reusability, performance, automation, security, and metrics
- Using JSON to store and transport data.
- Working with other team members and the team lead, using project management tools and version control, to create industry-leading technological products.
Skills required:
- 1-3 years of experience with MongoDB.
- Knowledge about AWS Cloud is an added advantage.
- Knowledge of .NET development and lifecycle methods in the C# language.
- Experience building Web API 2 services, including routes, class components, async methods, parallel programming, authentication, and authorization.
- Experience and hands-on knowledge of data management methods with MongoDB, MySQL, or NoSQL, etc.
- Should have experience and understanding of using JSON.
- Familiarity with working with .NET Framework, JavaScript, HTML. Knowledge of .NET Core is preferred.
- Knowledge and experience working with AWS Cloud and cloud computing techniques is an added advantage, but not a requirement.
- A BSc degree in Computer Science is optional, but candidates should have some formal grounding in computer science.
- Experience in Web and Mobile Applications.
- Agility and the ability to adapt quickly to changing requirements, scope, and priorities
- Strong proficiency with JavaScript (ECMAScript 5, 6)
- Knowledge of Node.js and frameworks available for it (SailsJS, Express)
- Understanding the nature of asynchronous programming and its quirks and workarounds.
- Familiarity with front-end technologies.
- User authentication and authorization between multiple systems, servers, and environments.
- Interaction with multiple data sources.
- Good understanding of SQL syntax.
- Understanding fundamental design principles behind a scalable application.
- Understanding differences between multiple delivery platforms, such as mobile vs. desktop, and optimizing output to match the specific platform.
- Implementing automated testing platforms and unit tests.
- Proficient understanding of code versioning tools, such as Git.
- Knowledge in the field of IoT is good to have.
• Build data pipelines for structured/unstructured data, real-time/batch processing, and event-driven (synchronous/asynchronous) workloads using MQ, Kafka, and stream processing in Java/Python (a minimal consumer sketch follows this list)
• Design data stores for Big Data systems, with expertise in Cassandra and HBase
• Implement indexing and search using Elasticsearch
• Set up and deploy Cassandra and Elasticsearch clusters
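As a minimal sketch of the Kafka-based pipeline work described above, the snippet below shows a Python consumer using the kafka-python client; the topic name, broker address, and group id are placeholders rather than details from this role.

```python
# Minimal Kafka consumer sketch using the kafka-python client. The topic name,
# broker address, and group id are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                              # placeholder topic
    bootstrap_servers=["localhost:9092"],  # placeholder broker
    group_id="pipeline-workers",           # placeholder consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    # A real pipeline would validate and enrich each event, then write it to
    # the Cassandra/Elasticsearch stores described in this posting.
    print(message.topic, message.partition, message.offset, message.value)
```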
Required Qualifications and Competencies:
• Strong hands-on experience with Cassandra: data modeling, data replication, clustering, and indexing for handling large data sets
• Experience with SQL, NoSQL, relational database design, and methods for efficiently retrieving data for Time Series Analytics
• Strong understanding of CQL and data modeling in order to achieve highly performant data access (see the sketch after this list)
• Strong experience in Cassandra data modeling to design efficient storage models that meet a variety of business needs
• Elasticsearch skills, with significant experience working with large Elasticsearch clusters: cluster performance optimisation, capacity planning, enhancing monitoring capabilities for early issue detection, and driving operational readiness and ongoing maintenance
• Strong hands-on experience of programming with Java / Python
• Ability to troubleshoot and investigate stability and performance issues
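As a hedged illustration of the CQL and Cassandra data-modeling points above, the snippet below uses the DataStax cassandra-driver; the keyspace, table, and contact point are hypothetical.

```python
# Sketch of query-driven Cassandra modeling with the DataStax cassandra-driver.
# The keyspace, table, and contact point are hypothetical.
import datetime
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])  # placeholder contact point
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS iot
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")

# Partition by (device_id, day) and cluster by ts DESC so that the common
# read -- "latest readings for one device on one day" -- hits one partition.
session.execute("""
    CREATE TABLE IF NOT EXISTS iot.readings (
        device_id text,
        day date,
        ts timestamp,
        value double,
        PRIMARY KEY ((device_id, day), ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

insert = session.prepare(
    "INSERT INTO iot.readings (device_id, day, ts, value) VALUES (?, ?, ?, ?)"
)
session.execute(
    insert,
    ("sensor-1", datetime.date(2024, 1, 1), datetime.datetime(2024, 1, 1, 12), 21.5),
)
```

Bucketing partitions by day keeps any single partition from growing without bound, which is the kind of storage-model trade-off the qualifications above describe.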
