
About Us: We are a group of techies, data analysts, digital marketers, and operations professionals who have come together to provide next-generation e-commerce solutions. We help brands and private labels grow their businesses on platforms such as Amazon and their own websites. We love brainstorming ideas on marketing businesses through digital media and using technology to execute them and achieve the desired goals.
Current Offerings of Technology Department:
- Creating scalable and conversion-friendly eCommerce web platforms using headless integration of Magento and ReactJS
- Building PWA platforms to provide a better user experience
- Building distributor B2B portals
- Migrating legacy systems to boost conversion and effectiveness while preserving their orders, customers, product data, and SEO benefits
Key Responsibilities:
- Be responsible for scope management across the multiple projects undertaken
- Work directly with clients and consultants to gather requirements to understand their goals and work with the internal team to develop the best solution
- Liaise with external and internal teams to act as the primary technical contact for queries related to the project
- Work with development teams to get them in sync with client requirements
- Improve processes and methodologies to achieve faster and better delivery of projects
- Work on internal projects (product and business intelligence projects) to achieve the goal of the respective team
- Write user stories, document the product requirements, workflows & test criteria
- Actively work with Sales team on pre-sales proposal activities & high-level solutioning
- Take initiatives to set up processes in the company and be actively involved in hiring and building the future team
- Be the custodian of functional & non-functional requirements of the projects undertaken
- Work with Product Owner to define, prioritize requirements and accurately maintain requirements traceability.
Skills You Should Have:
- Excellent communicator and relationship builder: A track record of collaborating with a variety of cross-functional teams in a dynamic environment.
- Dynamic: Strong decision-making and prioritization ability. Comfortable dealing with lots of moving pieces, attentive to detail, and comfortable learning new technologies and systems
- Team player: Knack for influencing without being authoritative. Pitch in wherever the team needs help, from writing blog posts to supporting customers
- Sense of data: Ability to turn insights into actionable growth initiatives
- Accountability: High sense of ownership and relentlessness to deliver projects with high business impact. You value integrity and derive great personal joy from efficient teamwork
- Problem-solving: Good at problem-solving, with the ability to bring in new ideas and drive the project agenda from scratch
- Tools: Experience with tools for capturing and maintaining requirements and mockups/sketches is ideal, e.g. Jira, Zoho
- Consulting approach: Ability to do feasibility study and high-level solutioning for clients
- Soft skills: Good English communication and a proactive, collaborative mindset to maintain continuous communication with clients on project updates, building trust and effective working relationships.
Requirements:
- Experience with all stages of the SDLC, from conceptualization to implementation.
- Exposure to Agile methodologies (Scrum)
- Experience working at an agile software company/startup is a bonus
- Strong product instincts: an intuition for what users want and an understanding of their business
Why should you consider joining Growisto?
- It will be a challenging role and you will get complete ownership to solve challenging problems. If you like challenges and think from first principles, you should definitely take this up
- If you have the aspiration to grow and develop as a leader in parallel to a multifold growth rate of a start-up then you should join us
- If we have to choose between culture/team and profits, our obvious choice is culture. If your thought process and personality resonate with our cultural values, then you should join us
Why should you not consider joining Growisto?
- If the role description does not excite you, then you should not join us
- We are a startup and things will move fast. If you are not comfortable in a fast-paced environment, then you should not join us
- Limitless growth and learning opportunities
- Opportunity to collaborate with multiple stakeholders across hierarchies
- A collaborative and positive culture: your team will be as smart and driven as you
- Guidance and mentorship from industry experts and renowned IIT alumni
- An opportunity to make an impact: your work will contribute directly to our strategy and growth

About Growisto
Work with the best: Learn from leaders who have built Growisto from the ground up. Work with down-to-earth, highly experienced, and insanely ambitious colleagues
As the business grows, you grow: We want Growisto to be built from within. We aim to grow 10x in the next two years, and you can play a significant role in helping Growisto achieve that
More ownership: You will be a key member of the team at Growisto. We will look forward to giving you maximum ownership for your work
The Knowledge Graph Architect is responsible for designing, developing, and implementing knowledge graph technologies to enhance organizational data understanding and decision-making capabilities. This role involves collaborating with data scientists, engineers, and business stakeholders to integrate complex data into accessible and insightful knowledge graphs.
Work you’ll do
1. Design and develop scalable and efficient knowledge graph architectures.
2. Implement knowledge graph integration with existing data systems and business processes.
3. Lead the ontology design, data modeling, and schema development for knowledge representation.
4. Collaborate with IT and business units to understand data needs and deliver comprehensive knowledge graph solutions.
5. Manage the lifecycle of knowledge graph data, including quality, consistency, and updates.
6. Provide expertise in semantic technologies and machine learning to enhance data interconnectivity and retrieval.
7. Develop and maintain documentation and specifications for system architectures and designs.
8. Stay updated with the latest industry trends in knowledge graph technologies and data management.
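To make the knowledge-representation work above concrete: RDF, mentioned in the requirements, models a knowledge graph as subject-predicate-object triples, and SPARQL queries match patterns over them. As a minimal, library-free sketch (the example data and prefixes are hypothetical, and a real system would use a graph database such as Neo4j or Amazon Neptune rather than an in-memory list):

```python
# Minimal illustration of the RDF triple model: an in-memory triple store
# with SPARQL-style pattern matching. Example data is hypothetical.

def match(store, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    like a variable in a SPARQL basic graph pattern."""
    return [(ts, tp, to) for (ts, tp, to) in store
            if s in (None, ts) and p in (None, tp) and o in (None, to)]

# Example knowledge graph: (subject, predicate, object) triples
triples = [
    ("acme:Order42", "rdf:type",      "schema:Order"),
    ("acme:Order42", "schema:seller", "acme:StoreA"),
    ("acme:StoreA",  "rdf:type",      "schema:Store"),
]

# Analogous to: SELECT ?s WHERE { ?s rdf:type schema:Order }
orders = [s for s, _, _ in match(triples, p="rdf:type", o="schema:Order")]
print(orders)  # ['acme:Order42']
```

The point of the sketch is that once data is reduced to triples, every query is just pattern matching over one uniform structure, which is what makes heterogeneous data sources composable in a knowledge graph.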
The Team
Innovation & Technology anticipates how technology will shape the future and begins building future capabilities and practices today. I&T drives the ideation, incubation, and scaling of hybrid businesses and tech-enabled offerings across a prioritized offering portfolio and industry interactions.
It drives cultural and capability transformation from solely services-based businesses to hybrid businesses. While others bet on the future, I&T builds it with you.
I&T encompasses many teams—dreamers, designers, builders—and partners with the business to bring a unique POV to deliver services and products for clients.
Qualifications and Experience
Required:
1. Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
2. 6-10 years of professional experience in data engineering, with proven experience in designing and implementing knowledge graph systems.
3. Strong understanding of semantic web technologies (RDF, SPARQL, GraphQL, OWL, etc.).
4. Experience with graph databases such as Neo4j, Amazon Neptune, or others.
5. Proficiency in programming languages relevant to data management (e.g., Python, Java, JavaScript).
6. Excellent analytical and problem-solving abilities.
7. Strong communication and collaboration skills to work effectively across teams.
Preferred:
1. Experience with machine learning and natural language processing.
2. Experience with Industry 4.0 technologies and principles
3. Prior exposure to cloud platforms and services like AWS, Azure, or Google Cloud.
4. Experience with containerization technologies like Docker and Kubernetes
Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation and deployment processes
- Database and cloud technologies
- Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
- Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular:
  - Composer (Airflow): Orchestration of data pipelines for ETL processes
  - Cloud Functions: Development of serverless functions for data processing and automation
  - Cloud Scheduler: Planning and automation of recurring cloud jobs
  - Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
  - BigQuery: Processing, analyzing and querying large amounts of data in the cloud
  - Cloud Storage: Storage and management of structured and unstructured data
  - Cloud Monitoring: Monitoring the performance and stability of cloud-based applications
- Data visualization and reporting
- Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
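The orchestration work described above (Composer/Airflow running ETL steps) boils down to executing tasks in dependency order. As a hedged, library-free sketch of that idea only (the task names and return values are hypothetical, and this is not Airflow code, which would instead declare a DAG of operators):

```python
# Sketch of what an orchestrator like Airflow does at its core:
# resolve a dependency graph and run tasks in topological order.
from graphlib import TopologicalSorter

def extract():
    return "raw rows"        # e.g. pull from a source system

def transform():
    return "clean rows"      # e.g. clean and reshape the data

def load():
    return "loaded"          # e.g. write into the warehouse

tasks = {"extract": extract, "transform": transform, "load": load}

# Map each task to the set of tasks it depends on:
# load depends on transform, which depends on extract.
deps = {"transform": {"extract"}, "load": {"transform"}}

order = list(TopologicalSorter(deps).static_order())
results = {name: tasks[name]() for name in order}
print(order)  # ['extract', 'transform', 'load']
```

In a real Composer deployment the same dependency declaration would be expressed with Airflow operators and `>>` edges, and the scheduler, not a loop, would decide when each task runs.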
Requirements
- Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, and of designing ETL and orchestration of data pipelines.
- Minimum of 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.
- Big data developer with 8+ years of professional IT experience with expertise in Hadoop ecosystem components in ingestion, Data modeling, querying, processing, storage, analysis, Data Integration and Implementing enterprise level systems spanning Big Data.
- A skilled developer with strong problem solving, debugging and analytical capabilities, who actively engages in understanding customer requirements.
- Expertise in Apache Hadoop ecosystem components like Spark, Hadoop Distributed File System (HDFS), MapReduce, Hive, Sqoop, HBase, ZooKeeper, YARN, Flume, Pig, NiFi, Scala and Oozie.
- Hands on experience in creating real - time data streaming solutions using Apache Spark core, Spark SQL & DataFrames, Kafka, Spark streaming and Apache Storm.
- Excellent knowledge of Hadoop architecture and the daemons of Hadoop clusters, which include NameNode, DataNode, ResourceManager, NodeManager and JobHistory Server.
- Worked on both Cloudera and Hortonworks Hadoop distributions. Experience in managing Hadoop clusters using the Cloudera Manager tool.
- Well versed in installation, Configuration, Managing of Big Data and underlying infrastructure of Hadoop Cluster.
- Hands-on experience in coding MapReduce/YARN programs using Java, Scala and Python for analyzing Big Data.
- Exposure to Cloudera development environment and management using Cloudera Manager.
- Extensively worked on Spark using Scala on clusters for computational analytics; installed it on top of Hadoop and performed advanced analytical applications by making use of Spark with Hive and SQL/Oracle.
- Implemented Spark using Python, utilizing DataFrames and the Spark SQL API for faster processing of data; handled importing data from different data sources into HDFS using Sqoop, performing transformations using Hive and MapReduce, and then loading the data into HDFS.
- Used Spark Data Frames API over Cloudera platform to perform analytics on Hive data.
- Hands on experience in MLlib from Spark which are used for predictive intelligence, customer segmentation and for smooth maintenance in Spark streaming.
- Experience in using Flume to load log files into HDFS and Oozie for workflow design and scheduling.
- Experience in optimizing MapReduce jobs to use HDFS efficiently by using various compression mechanisms.
- Working on creating data pipeline for different events of ingestion, aggregation, and load consumer response data into Hive external tables in HDFS location to serve as feed for tableau dashboards.
- Hands on experience in using Sqoop to import data into HDFS from RDBMS and vice-versa.
- In-depth Understanding of Oozie to schedule all Hive/Sqoop/HBase jobs.
- Hands on expertise in real time analytics with Apache Spark.
- Experience in converting Hive/SQL queries into RDD transformations using Apache Spark, Scala and Python.
- Extensive experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS).
- Experience in Microsoft cloud and setting up clusters in Amazon EC2 & S3, including the automation of setting up and extending the clusters in the AWS cloud.
- Extensively worked on Spark using Python on cluster for computational (analytics), installed it on top of Hadoop performed advanced analytical application by making use of Spark with Hive and SQL.
- Strong experience and knowledge of real time data analytics using Spark Streaming, Kafka and Flume.
- Knowledge in installation, configuration, supporting and managing Hadoop Clusters using Apache, Cloudera (CDH3, CDH4) distributions and on Amazon web services (AWS).
- Experienced in writing Ad Hoc queries using Cloudera Impala, also used Impala analytical functions.
- Experience in creating Data frames using PySpark and performing operation on the Data frames using Python.
- In depth understanding/knowledge of Hadoop Architecture and various components such as HDFS and MapReduce Programming Paradigm, High Availability and YARN architecture.
- Established multiple connections to different Redshift clusters (Bank Prod, Card Prod, SBBDA Cluster) and provided access for pulling the information needed for analysis.
- Generated various kinds of knowledge reports using Power BI based on Business specification.
- Developed interactive Tableau dashboards to provide a clear understanding of industry specific KPIs using quick filters and parameters to handle them more efficiently.
- Well Experience in projects using JIRA, Testing, Maven and Jenkins build tools.
- Experienced in designing, building, deploying and utilizing almost all of the AWS stack (including EC2 and S3), focusing on high availability, fault tolerance, and auto-scaling.
- Good experience with use-case development, with Software methodologies like Agile and Waterfall.
- Working knowledge of Amazon's Elastic Cloud Compute (EC2) infrastructure for computational tasks and Simple Storage Service (S3) as a storage mechanism.
- Good working experience in importing data using Sqoop and SFTP from various sources like RDBMS, Teradata, Mainframes, Oracle and Netezza to HDFS, and performing transformations on it using Hive, Pig and Spark.
- Extensive experience in Text Analytics, developing different Statistical Machine Learning solutions to various business problems and generating data visualizations using Python and R.
- Proficient in NoSQL databases including HBase, Cassandra and MongoDB, and their integration with Hadoop clusters.
- Hands on experience in Hadoop Big data technology working on MapReduce, Pig, Hive as Analysis tool, Sqoop and Flume data import/export tools.
- Hands-on knowledge on various CI-CD tools (Jenkins/TeamCity, Artifactory, UCD, Bitbucket/Github, SonarQube) including setting up of build-deployment automated pipelines.
- Very good knowledge of scripting tools and languages such as Shell, Perl or Python, YAML/Groovy, and build tools such as Maven/Gradle.
- Hands-on knowledge in containerization and orchestration tools such as Docker, OpenShift and Kubernetes.
- Good knowledge in configuration management tools such as Ansible, Puppet/Chef and have worked on setting up of monitoring tools (Splunk/Geneos/New Relic/Elk).
- Expertise in job schedulers/workload automation tools such as Control-M or AutoSys is good to have.
- Hands-on knowledge on Cloud technology (preferably GCP) including various computing services and infrastructure setup using Terraform.
- Should have basic understanding on networking, certificate management, Identity and Access Management and Information security/encryption concepts.
- Should support day-to-day tasks related to platform and environment upkeep such as upgrades, patching, migration and system/interface integration.
- Should have experience working in an Agile-based SDLC delivery model, multi-tasking and supporting multiple systems/apps.
- Big-data and Hadoop ecosystem knowledge is good to have but not mandatory.
- Should have worked on standard release, change and incident management tools such as ServiceNow/Remedy or similar
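Several bullets above reference the MapReduce programming paradigm. As a minimal, framework-free sketch of the model (plain Python, not Hadoop code; the input lines are made up), the classic word-count job maps lines to (word, 1) pairs, shuffles them by key, and reduces each group by summing:

```python
# Framework-free sketch of the MapReduce paradigm: word count.
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the counts for each word
    return {word: sum(counts) for word, counts in groups.items()}

lines = ["big data big pipelines", "big clusters"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 3, 'data': 1, 'pipelines': 1, 'clusters': 1}
```

In Hadoop or Spark the same three phases run distributed across a cluster, with the shuffle moving data between nodes; the logic per record is the same.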
Platform for creators to build communities and monetise them.

Social networks have become more media platforms than "social" platforms. They are not designed to make real human connections, and demand for more intimate online places is higher than ever. Communities are in huge demand! Also, these ad-driven platforms only reward top creators. Most creators are left out, no matter how engaged their audience is, and have to constantly beat algorithms to reach their own audience. Over 50 million people around the world consider themselves creators; 97% of them aren't getting paid. As a result, creators are flocking to platforms like Slack, Discord, WhatsApp and Telegram, none of which were built either for community engagement or creator monetisation. Community creators have to go through the pain of managing multiple third-party tools to engage and monetise their community.

LikeMinds helps creators convert their audience into a branded private community. They can monetise it by selling digital products like memberships, group chats, events, consultations and courses. As a platform purpose-built for community entrepreneurs, LikeMinds has inbuilt tools for member acquisition, onboarding, retention, moderation, monetisation, and referrals. Our platform enables creators to drive high engagement and belonging via chat rooms, events, polls, a member directory, and rewards. These features are in addition to powerful business tools like a community website, subscription management, event reminders, an analytics dashboard, cohort creation, in-app banners, data export, and much more. Custom integrations are also available for creators with existing assets.

In the last few months, we have helped 100+ creators monetise their communities. These include entertainers, fitness & wellness trainers, professional domain experts, language instructors, life coaches, financial experts, micro-entrepreneurship coaches, and exam prep coaches.
We are a team of entrepreneurs, techies and community builders with credible backgrounds supported by 2 large VCs
Skills Required:
- JavaScript, CSS, HTML5, Angular
- Strong knowledge of Angular is a must
- Fluent in responsive design and mobile/tablet UI/UX
- Web mobile (touch devices: mobiles and tablets) experience is expected
- UI performance: should have hands-on experience optimizing page load, rendering, and caching
- Good understanding of AJAX and its UI implications
- Understanding of REST APIs
- Have worked on high-scale growth projects in the past
- Prior experience working in a tech startup is a plus
Senior Software Engineer
The Company strives to provide a great service and to deliver excellent user experiences that create value for our customers.
We are seeking a new developer to join our development team. We work on a variety of projects using various exciting technologies such as Ruby on Rails, AWS, and JS frameworks such as Node and React to build outstanding web apps and provide business support to meet our customers' needs.
The ideal candidate will be someone who loves to learn and is willing to look for new and innovative solutions to problems and challenges for our customers. They may or may not have a wealth of experience, but the willingness to learn, try new things, and think problems through and analyze the available options is more meaningful; we can guide you through the rest. You will have opportunities to combine your technical ability, critical thinking, and design experience.
Cool things to have:
- Ability to communicate clearly. Programming isn't just about writing code, it's about being able to communicate ideas with your team, customers, and the machine
- Keen desire to improve: We are constantly aspiring to improve our work and ourselves, and by constantly learning we can do this.
- Self-motivated: Even if a decision turns out to be wrong, this is better than not deciding at all. Trying and learning from failures is hugely valuable.
- Knowing when to ask for help: Try for as long as you can, but if you get stuck, ask for help. Your teammates are here to support you and your development.
- Critical thinking: Be able to base decisions on the information available, not just blindly accepting solutions without investigation.
Technical Requirements:
- Strong programming experience. We primarily write in Ruby, so experience here is a plus
- Knowledge of Object-Oriented design patterns as well as Test-Driven Development
- Be familiar with Rails, Sinatra, etc., and how these can be used to implement RESTful APIs
- Knowledge of RDBMS such as Postgres or MySQL and SQL to run queries against them
Nice to have skills
- Knowledge of any AWS systems such as EC2, Lambda, RDS, etc.
- Knowledge of API design patterns, REST, RPC, and can explain the advantages and disadvantages.
- Knowledge of DevOps skills such as Terraform, CI/CD, etc.
We are looking for candidates who have development experience and have delivered CI/CD-based projects. Candidates should have good hands-on experience with Jenkins master-slave architecture and have used AWS native services like CodeCommit, CodeBuild, CodeDeploy and CodePipeline. They should also have experience setting up cross-platform CI/CD pipelines that span different cloud platforms, or both on-premise and cloud environments.
Job Description:
- Hands on with AWS (Amazon Web Services) Cloud with DevOps services and CloudFormation.
- Experience interacting with customers.
- Excellent communication skills.
- Hands-on experience in creating and managing Jenkins jobs, and in Groovy scripting.
- Experience in setting up Cloud Agnostic and Cloud Native CI/CD Pipelines.
- Experience in Maven.
- Experience in scripting languages like Bash, Powershell, Python.
- Experience in automation tools like Terraform, Ansible, Chef, Puppet.
- Excellent troubleshooting skills.
- Experience in Docker and Kubernetes, including creating Dockerfiles.
- Hands on with version control systems like GitHub, Gitlab, TFS, BitBucket, etc.
About Us
Instavans is a logistics tech startup focused on the development of SaaS products for the road trucking industry. Instavans has built and rolled out SmarTruck™, a SaaS-based TMS with a private marketplace. SmarTruck has customers in India and the MENA region and is poised to enter LatAm and Sub-Saharan Africa.
The company’s innovative go-to-market plan involves working with large Shippers and 4PLs (Fourth Party Logistics Service Providers) to synergistically bring their Carriers on board.
We are looking for people to help us supercharge our growth, while we continue to embody the values that have got us this far. Because we are a rapidly growing company, you will gain exposure to all areas of the platform, understanding the key success drivers of an engineering team and gaining invaluable experience for your future career. Success in this role will lead to opportunities for growth across the entire engineering team, with significant scope for future development.
We have openings in Frontend (Angular 8+), Backend (Node.js), Android (Kotlin) and QA (Automation)
Job Description: -
Comm-IT is seeking an enthusiastic Web Developer to help our fast-growing development team in India. You will play an important role and be a part of creating a product used by millions of people.
Responsibilities: -
- Design and build advanced web applications and products for desktop browsers, mobile web and feature phones.
- Work in an agile cross-functional team to create great customer experiences and implement high-quality code into production.
- Work on bug fixing and improving application performance.
- Continuously discover, evaluate, and implement new technologies to maximize development efficiency.
- Prioritize performance in all your development, both network and rendering
Requirements: -
- Minimum of 6 months - 1 year of experience in software development.
- This project requires full-stack experience creating applications and products using .NET and C# development, HTML5, the jQuery library and CSS3.
- Strong web backend knowledge in ASP.NET Web API, Razor Pages, .NET Entity Framework and MS SQL scripting.
- Experience developing REST-based APIs.
- Knowledge of TypeScript/JavaScript front-end and back-end development
- Knowledge of CSS and layout skills: modern Bootstrap and CSS layout techniques
- Good understanding of object-oriented programming and its implementation
- Interest in working with cloud-based technologies.
- Confidence to work as part of a Scrum team.
- Desire to learn new technologies.
Nice to have (Add-On): -
- Knowledge of multiple back-end languages (e.g., C#, Java, Python) and JavaScript frameworks (e.g., Angular, React, Node.js, Blazor Framework)
- Familiarity with databases (e.g., MySQL, MongoDB), web servers (e.g., Apache) and UI/UX design
- Experience creating automated unit tests and functional tests.






