
Job Details
Description
We are looking for a passionate Senior User Experience Designer to join our fast-growing team at LogiNext. You’ll collaborate with Product Managers and other UI designers to design new features, translate concepts into living, breathing prototypes, and rapidly iterate on interactions, animations, and details to deliver the best user experience. You will be responsible for conceptualizing & creating highly interactive user interfaces across all supported platforms.
You are passionate about design and are driven to create solutions for our clients that combine business objectives, design principles, and continuous user research into a compelling user experience. You have experience in designing a consistent and adaptable platform-based approach to user experience within a technology-driven and Software-as-a-Service (SaaS) based environment. You are a compulsive problem-solver with excellent visualization and communication skills.
Responsibilities:
- Gather, analyse & process data to determine user needs & enhance the user experience
- Research & learn new or re-emerging front-end technologies that facilitate improved user experience
- Conceptualise designs through storyboards & concept presentations
- Design beautiful digital interfaces and present ideas in a compelling manner
- Perform usability testing with prototypes
- Create wireframes & prototypes that shape a polished user interface
- Deliver strong visual communication design spanning graphic design, website design, drawing and illustration, animation, client discussions, and standard layouts
- Mentor other team members in conceptualising and visually articulating design solutions
Requirements:
- Bachelor’s degree or higher in Arts/Commerce/Science/Technology
- 8 to 10 years of experience in UI/UX designing
- Must have experience in designing user interfaces for B2B SaaS Products
- Must have extensive experience with tools such as Sketch, Photoshop, and Adobe Illustrator
- Able to work and thrive in a fast-paced, rapidly changing work environment
- Should have excellent communication skills, with an ability to convey ideas clearly & precisely
- Must have leadership skills & be a good team player
- Must be pro-active & a self-starter

About LogiNext
LogiNext is amongst the fastest-growing tech companies, providing solutions to simplify and automate the ecosphere of logistics and supply chain management. Our aim is to organize the daunting process of logistics and supply chain planning with an array of SaaS offerings, driven by some of the most robust enterprise solutions globally.
Our clientele is spread across the globe, and we empower them to optimize their supply chain operations through unique data capture, advanced analytics, and visualization. From inception, LogiNext has been an industry leader and a recipient of awards such as NetApp's Innovative Tech Company of the Year, Entrepreneur's Logistics Firm of the Year, Aegis's Innovation in Big Data, and the CIO Choice Award for best supply chain logistics cloud solutions.
Backed by influential industry leaders like PayTM and Indian Angel Network, and with partners like IBM, Microsoft, Google, AWS and Samsung, LogiNext has achieved exponential success in a very short span of time and is set to exceed 300% growth by the end of 2016. The true growth hackers who paved the way for this success are the people working exceptionally hard and adding value to our organisation. Our brand ambassadors (that's how we address our people) bring unique values, discipline and problem-solving skills to nurture the innovative and entrepreneurial work culture at LogiNext. Passion, versatility, expertise and a hunger for success is the mantra chanted by every Logi-Nexter!


About Us: As India's fastest-growing D2C brand, we are at the forefront of innovation and transformation in the market. We’re a well-funded, rapidly growing (we have recently launched our 100th store), omnichannel D2C brand with a passionate and innovative team.
Job Summary: We are seeking a Data Engineer to help us design, build and maintain our BigQuery data warehouse by performing ETL operations and creating unified data models. You will work across various data sources to create a cohesive data infrastructure that supports our omnichannel D2C strategy.
Why Join Us: Experience the exciting world of India's billion-dollar D2C market. As a well-funded, rapidly growing omnichannel D2C brand, we are committed to changing the way India sleeps and sits. You'll have the opportunity to work with a passionate and innovative team and make a real impact on our success.
Key Responsibilities:
ETL Operations: Design, implement, and manage ETL processes to extract, transform, and load data from various sources into BigQuery (see the sketch after this list).
Data Warehousing: Build and maintain a robust data warehouse in BigQuery, ensuring data integrity, security, and performance.
Data Modeling: Create and manage flat, unified data models using SQL and DBT to support business analytics and reporting needs.
Performance Optimization: Optimize ETL processes and data models to ensure timely data delivery for reporting and analytics.
Collaboration: Work closely with data analysts, product managers, and other stakeholders to understand data requirements and deliver actionable insights.
Documentation: Maintain comprehensive documentation of data workflows, ETL processes, and data models for reference and onboarding.
Troubleshoot: Monitor and troubleshoot data pipeline issues, ensuring timely resolution to minimize disruption to business operations.
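To ground the ETL and modeling responsibilities above, here is a minimal sketch of a batch load into BigQuery followed by a flat-model transform, using the google-cloud-bigquery Python client. The project, dataset, table, and bucket names are hypothetical placeholders, not this brand's actual setup.

```python
# Minimal sketch, assuming google-cloud-bigquery is installed and
# credentials are configured; all names below are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()

def load_orders_from_gcs(uri: str = "gs://example-bucket/orders/*.csv") -> None:
    """Extract/load: pull raw CSV exports from GCS into a staging table."""
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,                     # infer schema from the files
        write_disposition="WRITE_TRUNCATE",  # replace staging on each run
    )
    job = client.load_table_from_uri(
        uri, "example-project.staging.orders_raw", job_config=job_config
    )
    job.result()  # block until done; raises on load errors

def build_unified_model() -> None:
    """Transform: flatten staging rows into a unified reporting table."""
    client.query(
        """
        CREATE OR REPLACE TABLE `example-project.marts.orders_unified` AS
        SELECT order_id,
               channel,                                   -- store vs. online
               CAST(order_ts AS TIMESTAMP) AS order_ts,
               SAFE_CAST(amount AS NUMERIC) AS amount
        FROM `example-project.staging.orders_raw`
        """
    ).result()

if __name__ == "__main__":
    load_orders_from_gcs()
    build_unified_model()
```

In practice the transform step would typically live in a DBT model rather than an inline query, with the script handling only ingestion.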
Skills and Qualifications:
- Proficiency in SQL and experience with BigQuery
- Minimum 2 years of experience in data engineering or a similar role
- Experience with data pipeline and ETL tools (e.g., Apache Airflow, Talend, AWS Glue)
- Familiarity with cloud platforms (e.g., AWS, Google Cloud, Azure) and their data services
- Experience with data warehousing solutions (e.g., Amazon Redshift, Google BigQuery, Snowflake)
- Knowledge of data modeling, data architecture, and data governance best practices.
- Excellent problem-solving skills and attention to detail
- Knowledge of DBT (Data Build Tool) for data transformation
- Self-motivated, proactive, and highly accountable
- Excellent communication skills to effectively convey technical concepts and solutions
Bonus points: prior experience in the e-commerce or D2C space
About the Role:
We are seeking a skilled Data Engineer to join our growing AdTech team. In this role, you will design, build, and maintain high-performance ETL pipelines and large-scale data processing systems. You will work with massive datasets and distributed frameworks to power Adsremedy’s data-driven advertising solutions across Programmatic, In-App, CTV, and DOOH platforms.
What You’ll Do:
- Design, develop, and maintain scalable ETL pipelines on self-managed infrastructure
- Process and optimize large-scale datasets (terabytes of data) with high reliability and performance
- Build robust data processing workflows using Apache Spark (preferred) and/or Apache Flink (see the sketch after this list)
- Integrate, clean, and transform data from multiple internal and external sources
- Partner closely with data scientists, analysts, and business stakeholders to enable actionable insights
- Monitor, troubleshoot, and optimize data pipelines for operational excellence
- Ensure data quality, consistency, and performance across all data workflows
- Participate in code reviews and uphold best practices in data engineering
- Collaborate with QA teams to deliver production-ready, reliable systems
- Mentor junior engineers and promote knowledge sharing within the team
- Stay current with emerging data engineering tools, frameworks, and industry trends
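As an illustration of the Spark-based workflows listed above, the following is a minimal PySpark sketch of a batch ETL job. The S3 paths, event schema, and column names are hypothetical placeholders, not Adsremedy's actual data.

```python
# Minimal sketch, assuming a running Spark cluster; the S3 paths and
# event schema below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adtech-etl-sketch").getOrCreate()

# Extract: read raw ad-event logs (terabyte-scale in practice).
events = spark.read.parquet("s3a://example-bucket/raw/ad_events/")

# Transform: filter, deduplicate, and aggregate per campaign and day.
daily_stats = (
    events
    .filter(F.col("event_type").isin("impression", "click"))
    .dropDuplicates(["event_id"])
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("campaign_id", "event_date")
    .agg(
        F.count(F.when(F.col("event_type") == "impression", True)).alias("impressions"),
        F.count(F.when(F.col("event_type") == "click", True)).alias("clicks"),
    )
)

# Load: write partitioned output for downstream analytics and reporting.
(
    daily_stats.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-bucket/curated/campaign_daily/")
)
```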
What You’ll Need:
- 2+ years of experience building ETL pipelines using Apache Spark and/or Apache Flink
- Hands-on experience with big data caching solutions such as ScyllaDB, Aerospike, or similar
- Strong understanding of data lake architectures and tools like Delta Lake
- Proven experience handling terabytes of data in distributed environments
- Proficiency in Scala, Python, or Java
- Experience working with cloud data platforms (AWS S3, Azure Data Lake, Google BigQuery)
- Strong knowledge of SQL, data modeling, and data warehousing concepts
- Familiarity with Git and CI/CD workflows
- Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment
Nice to Have:
- Experience with Apache Kafka for real-time data streaming
- Familiarity with Apache Airflow or similar orchestration tools
Job Description for Data Engineer Role:
Must have:
Experience working with programming languages; solid foundational and conceptual knowledge is expected
Experience working with databases and SQL optimization
Experience as a team lead or tech lead, able to independently drive tech decisions and execution and to motivate the team in ambiguous problem spaces
Problem-solving, judgement, and strategic decision-making skills to drive the team forward
Role and Responsibilities:
- Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community; from time to time, you may be asked to write or evaluate code
- Collaborate with digital product managers and leaders from other teams to refine the strategic needs of the project
- Utilize programming languages like Java, Python, SQL, Node, Go, and Scala, along with open-source RDBMS and NoSQL databases
- Define best practices for data validation, automating as much as possible and aligning with enterprise standards
Qualifications:
- Experience with SQL and NoSQL databases.
- Experience with cloud platforms, preferably AWS.
- Strong experience with data warehousing and data lake technologies (Snowflake)
- Expertise in data modelling
- Experience with ETL/ELT tools and methodologies
- Experience working on real-time Data Streaming and Data Streaming platforms
- 2+ years of experience in at least one of the following: Java, Scala, Python, Go, or Node.js
- 2+ years working with SQL and NoSQL databases, data modeling and data management
- 2+ years of experience with AWS, GCP, Azure, or another cloud service.
Senior Data Engineer
Location: Bangalore, Gurugram (Hybrid)
Experience: 4-8 Years
Type: Full Time | Permanent
Job Summary:
We are looking for a results-driven Senior Data Engineer to join our engineering team. The ideal candidate will have hands-on expertise in data pipeline development, cloud infrastructure, and BI support, with a strong command of modern data stacks. You’ll be responsible for building scalable ETL/ELT workflows, managing data lakes and marts, and enabling seamless data delivery to analytics and business intelligence teams.
This role requires deep technical know-how in PostgreSQL, Python scripting, Apache Airflow, AWS or other cloud environments, and a working knowledge of modern data and BI tools.
Key Responsibilities:
PostgreSQL & Data Modeling
· Design and optimize complex SQL queries, stored procedures, and indexes
· Perform performance tuning and query plan analysis
· Contribute to schema design and data normalization
Data Migration & Transformation
· Migrate data from multiple sources to cloud or ODS platforms
· Design schema mapping and implement transformation logic
· Ensure consistency, integrity, and accuracy in migrated data
Python Scripting for Data Engineering
· Build automation scripts for data ingestion, cleansing, and transformation
· Handle file formats (JSON, CSV, XML), REST APIs, and cloud SDKs (e.g., Boto3); see the sketch after this list
· Maintain reusable script modules for operational pipelines
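For the scripting items above, here is a minimal sketch of an ingestion script that pulls JSON from a REST API, flattens it to CSV, and lands it in S3 via Boto3. The endpoint, bucket, and key are hypothetical placeholders, and the records are assumed to share the same flat keys.

```python
# Minimal sketch; the API endpoint, bucket, and key are hypothetical
# placeholders, and records are assumed to share the same flat keys.
import csv
import io

import boto3
import requests

def fetch_orders(api_url: str = "https://api.example.com/v1/orders") -> list[dict]:
    """Pull JSON records from a (hypothetical) REST endpoint."""
    resp = requests.get(api_url, timeout=30)
    resp.raise_for_status()
    return resp.json()

def to_csv(records: list[dict]) -> str:
    """Flatten JSON records into CSV for downstream loading."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(records[0].keys()))
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def upload_to_s3(body: str, bucket: str = "example-data-lake",
                 key: str = "raw/orders/orders.csv") -> None:
    """Land the cleansed file in S3 for the pipeline to pick up."""
    boto3.client("s3").put_object(Bucket=bucket, Key=key,
                                  Body=body.encode("utf-8"))

if __name__ == "__main__":
    records = fetch_orders()
    if records:
        upload_to_s3(to_csv(records))
```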
Data Orchestration with Apache Airflow
· Develop and manage DAGs for batch/stream workflows (see the sketch after this list)
· Implement retries, task dependencies, notifications, and failure handling
· Integrate Airflow with cloud services, data lakes, and data warehouses
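As a minimal Airflow sketch (assuming Airflow 2.4+) of the retry, dependency, and notification handling described above; the task bodies, schedule, and alert address are hypothetical placeholders.

```python
# Minimal sketch, assuming Airflow 2.4+; task bodies, the alert address,
# and the schedule are hypothetical placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    ...  # pull raw data from sources

def transform():
    ...  # cleanse and reshape

def load():
    ...  # write to the warehouse/data lake

default_args = {
    "owner": "data-eng",
    "retries": 3,                          # automatic retries on failure
    "retry_delay": timedelta(minutes=5),
    "email": ["data-alerts@example.com"],  # failure notifications
    "email_on_failure": True,
}

with DAG(
    dag_id="example_batch_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Task dependencies: extract -> transform -> load
    t_extract >> t_transform >> t_load
```

In practice the callables would invoke the ingestion and transformation modules above, and an on_failure_callback could supplement the email alerts.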
Cloud Platforms (AWS / Azure / GCP)
· Manage data storage (S3, GCS, Blob), compute services, and data pipelines
· Set up permissions, IAM roles, encryption, and logging for security
· Monitor and optimize cost and performance of cloud-based data operations
Data Marts & Analytics Layer
· Design and manage data marts using dimensional models
· Build star/snowflake schemas to support BI and self-serve analytics
· Enable incremental load strategies and partitioning (see the sketch below)
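To illustrate the incremental-load point above, here is a minimal sketch of an idempotent fact-table merge into a star schema, shown in BigQuery's dialect via its Python client (one of the warehouses named in the next subsection). The tables and columns are hypothetical placeholders.

```python
# Minimal sketch in BigQuery's dialect, using its Python client; the
# star-schema tables and columns below are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

INCREMENTAL_FACT_MERGE = """
MERGE `example-project.marts.fact_sales` AS f
USING (
    SELECT order_id, customer_key, date_key, store_key, amount
    FROM `example-project.staging.orders_clean`
    WHERE load_date = @load_date              -- only the new partition
) AS s
ON f.order_id = s.order_id
WHEN MATCHED THEN
  UPDATE SET amount = s.amount                -- late-arriving corrections
WHEN NOT MATCHED THEN
  INSERT (order_id, customer_key, date_key, store_key, amount)
  VALUES (s.order_id, s.customer_key, s.date_key, s.store_key, s.amount)
"""

def load_fact_sales(load_date: str) -> None:
    """Merge one day's rows into the fact table; reruns are idempotent."""
    job_config = bigquery.QueryJobConfig(
        query_parameters=[
            bigquery.ScalarQueryParameter("load_date", "DATE", load_date)
        ]
    )
    client.query(INCREMENTAL_FACT_MERGE, job_config=job_config).result()
```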
Modern Data Stack Integration
· Work with tools like DBT, Fivetran, Redshift, Snowflake, BigQuery, or Kafka
· Support modular pipeline design and metadata-driven frameworks
· Ensure high availability and scalability of the stack
BI & Reporting Tools (Power BI / Superset / Supertech)
· Collaborate with BI teams to design datasets and optimize queries
· Support development of dashboards and reporting layers
· Manage access, data refreshes, and performance for BI tools
Required Skills & Qualifications:
· 4–6 years of hands-on experience in data engineering roles
· Strong SQL skills in PostgreSQL (tuning, complex joins, procedures)
· Advanced Python scripting skills for automation and ETL
· Proven experience with Apache Airflow (custom DAGs, error handling)
· Solid understanding of cloud architecture (especially AWS)
· Experience with data marts and dimensional data modeling
· Exposure to modern data stack tools (DBT, Kafka, Snowflake, etc.)
· Familiarity with BI tools like Power BI, Apache Superset, or Supertech BI
· Version control (Git) and CI/CD pipeline knowledge is a plus
· Excellent problem-solving and communication skills
Responsibilities:
- Maintain accurate financial records and transactions
- Prepare financial statements, including income statements and balance sheets
- Analyze financial data to identify trends and provide insights for decision-making; assist in budgeting, forecasting, and monitoring actual performance against targets
- Ensure tax compliance, including preparing and filing tax returns; coordinate and facilitate internal and external audits
- Use accounting software to generate reports; ensure compliance with financial regulations and handle reporting to regulatory bodies
- Assess financial risks and propose mitigation strategies; provide financial advice and guidance based on analysis and market trends
- Visit client offices and collect payments
Requirements:
- A minimum of 1 year of experience in finance and accounting
- A bachelor's or master's degree in a relevant accounting field, or a professional accounting qualification
- Proficiency in Tally and Management Information Systems (MIS)
- Excellent communication skills for conveying financial information
- Strong knowledge of accounting principles, financial reporting standards, and tax regulations; analytical skills to interpret financial data and identify patterns
- Proficiency in accounting software, spreadsheet applications, and the MS Office suite
- Problem-solving abilities to identify and address financial issues
- Male candidates only
Rescale Technologies (a Fincash Group venture), the brainchild of three IIT Mumbai alumni, is a tech leader that enables businesses to navigate technological transformation globally. Rescale has played a pivotal role in the mergers and acquisitions of numerous brands. (https://rescaletechnologies.com/)
Our USP is Creative Technology: we bring a new age of imagination and a brilliant team who are passionate about success. Here is your chance to ride with us:
Designation: Full Stack Developer
Experience: 2+ years
Education: B.E/B.Tech
Location: Lodha iThink, Palava Dombivali
Job description:
Hands-on experience in Angular or React, Node.js, and Elastic
Knowledge of HTML/web development and networking
Basic knowledge of programming languages is required.
Good knowledge of SQL/RDBMS
Fast learner with a desire to learn new technologies to keep up with market trends
Produce clean, efficient code based on specifications.
Relevant work experience:
- 5+ years working in the information technology industry
- 2+ years of systems engineering or platform engineering
Work Location: Hyderabad
Experience: 8 to 10 years
Package: Up to 10 LPA
Notice Period: Immediate to 15 days
It's a full-time opportunity with our client.
Mandatory Skills: C#, WCF, MVC, Web API, ASP.Net, Azure, Ajax, JavaScript & LINQ
Required Skillset:
• Should have at least 8–10 years of experience in .NET development.
• Strong knowledge of C#, WCF, MVC, Web API, ASP.Net, Azure
• Good working experience with Ajax, JavaScript, jQuery, JSON, and HTML/CSS.
• Strong experience with the ASP.NET framework, SQL Server, and design/architectural patterns (e.g., Model-View-Controller (MVC))
• Good knowledge of JavaScript frameworks like jQuery and Node.js, including installation, configuration & deployment.
• Familiarity with APIs (REST, RPC)
• Sound knowledge of object-oriented principles.
• Understanding of Agile development methodologies
• Working knowledge of LINQ and Web Services.
Job Description:
• Perform project tasks in C#, WCF, MVC, Web API, ASP.Net, Ajax, JavaScript, jQuery, JSON, HTML, Entity Framework, etc.
• Lead the team development effort towards successful product delivery.
• Design and implement complex user stories.
• Design and implement reusable components and services.
• Provide technical leadership to teammates through coaching and mentorship.
• Provide architecture guidance to the Product Development team working on products with strong focus on solution architecture, performance, scaling and security.
• Effectively resolve problems and roadblocks as they occur, consistently following through on details while driving innovation as well as issue resolution.
• Mentor development and support teams.
• Anticipate and prevent problems and roadblocks before they occur.