Join Our Journey
Jules develops an amazing end-to-end solution for recycled materials traders, importers and exporters, which means a lot of internal, structured data to work with in order to provide reporting, alerting and insights to end users. With about 200 tables covering all business processes, from order management to payments, including logistics, hedging and claims, the value the data entered in Jules can unlock is massive.
After starting with a simple stack made of Postgres, SQL queries and a visualization solution, the company is now ready to set up its data stack and is only missing you. We are considering DBT, Redshift or Snowflake, Fivetran, Metabase or Luzmo, etc. We also have an AI team already experimenting with text-driven data interaction.
As a Data Engineer at Jules AI, your duties will involve both data engineering and product analytics, enhancing our data ecosystem. You will collaborate with cross-functional teams to design, develop, and sustain data pipelines, and conduct detailed analyses to generate actionable insights.
Roles And Responsibilities:
- Work with stakeholders to determine data needs, and design and build scalable data pipelines.
- Develop and sustain ELT processes to guarantee timely and precise data availability for analytical purposes.
- Construct and oversee large-scale data pipelines that collect data from various sources.
- Expand and refine our DBT setup for data transformation.
- Engage with our data platform team to address customer issues.
- Apply your advanced SQL and big data expertise to develop innovative data solutions.
- Enhance and debug existing data pipelines for improved performance and reliability.
- Generate and update dashboards and reports to share analytical results with stakeholders.
- Implement data quality controls and validation procedures to maintain data accuracy and integrity.
- Work with various teams to incorporate analytics into product development efforts.
- Use technologies like Snowflake, DBT, and Fivetran effectively.
Mandatory Qualifications:
- Hold a Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Possess at least 4 years of experience in Data Engineering, ETL Building, database management, and Data Warehousing.
- Demonstrated expertise as an Analytics Engineer or in a similar role.
- Proficient in SQL, a scripting language (Python), and a data visualization tool.
- Mandatory experience in working with DBT.
- Experience working with Airflow, cloud platforms like AWS or GCP, and Snowflake.
- Deep knowledge of ETL/ELT patterns.
- At least 1 year of experience in building data pipelines and leading data warehouse projects.
- Experienced in mentoring data professionals across all levels, from junior to senior.
- Proven track record in establishing new data engineering processes and navigating through ambiguity.
- Preferred Skills: Knowledge of Snowflake and reverse ETL tools is advantageous.
Grow, Develop, and Thrive With Us
- Global Collaboration: Work with a dynamic team that’s making an impact across the globe, in the recycling industry and beyond. We have customers in India, Singapore, the United States, Mexico, Germany, France and more.
- Professional Growth: A highway toward setting up a great data team and evolving into a leader.
- Flexible Work Environment: Competitive compensation, performance-based rewards, health benefits, paid time off, and flexible working hours to support your well-being.
Apply to us directly: https://nyteco.keka.com/careers/jobdetails/41442
About Nyteco
Nyteco Inc is a green tech venture for the recycled materials industry and manufacturing supply chain.
We serve the industry through our flagship company - Jules AI.
Nyteco aims to bring leading tech solutions to the recycling industry to help grow its trading business, connect with one another and much more!
Role Summary:
Join our dynamic team as a LinkedIn/Google PPC Specialist, where you will manage, optimize, and grow our PPC strategy and execution across major platforms, particularly LinkedIn and Google Ads. This role involves campaign setup, day-to-day management, and tracking performance to maximize ROI.
Responsibilities:
- Launch and optimize various PPC campaigns across digital platforms.
- Oversee accounts on search platforms (e.g., Google AdWords, LinkedIn).
- Monitor budgets and adjust bids to gain better ROI.
- Track KPIs to assess performance and pinpoint issues.
- Produce reports for management on the performance of campaigns.
Requirements:
- Proven experience as a PPC Manager or Digital Marketing Specialist.
- Experience in data analysis and reporting.
- Knowledge of SEO and digital marketing concepts.
- Familiarity with multiple platforms (e.g., AdWords, Bing) is preferred.
- Working knowledge of analytics tools (Google Analytics, Tableau, WebTrends, etc.).
- Some experience in a similar industry to ours is preferred.
Required Experience:
1) 3-10 years in Advanced Analytics, Predictive Modelling, Data Science and Machine Learning.
2) Proficiency in R programming, RStudio, R Server and SQL is a must.
Key Responsibilities:
- Coding, Programming and Development: Perform hands-on coding to build new R models and make changes to existing RStudio models in the relevant technologies and frameworks, ensuring project deliverables meet stringent quality and performance standards.
- Client Engagement & Requirements Gathering: Independently engage with client stakeholders to understand data landscapes and requirements, translating them into functional and technical specifications.
- Support and Maintenance: Manage and Support R Models in production environment.
- API & Database Management: Integrate APIs and manage databases (e.g., Hive, Greenplum, PostgreSQL, Oracle) to support seamless data flows.
As a Kafka Administrator at Cargill you will work across the full set of data platform technologies, spanning on-prem and SaaS solutions, that power highly performant, modern, data-centric solutions. Your work will play a critical role in enabling analytical insights and process efficiencies for Cargill’s diverse and complex business environments. You will work in a small team that shares your passion for building, configuring, and supporting platforms while sharing, learning and growing together.
- Develop and recommend improvements to standard and moderately complex application support processes and procedures.
- Review, analyze and prioritize incoming incident tickets and user requests.
- Perform programming, configuration, testing and deployment of fixes or updates for application version releases.
- Implement security processes to protect data integrity and ensure regulatory compliance.
- Keep an open channel of communication with users and respond to standard and moderately complex application support requests and needs.
MINIMUM QUALIFICATIONS
- Minimum of 2-4 years of experience
- Knowledge of Kafka cluster management, alerting/monitoring, and performance tuning
- Full-ecosystem Kafka administration (Kafka, ZooKeeper, kafka-rest, Kafka Connect)
- Experience implementing Kerberos security
- Preferred:
- Experience in Linux system administration
- Authentication plugin experience such as basic, SSL, and Kerberos
- Production incident support including root cause analysis
- AWS EC2
- Terraform
We are looking for a Backend Engineer at Prescribe. You will be responsible for architecting and developing the backend for features being added to the mobile app. You will be joining a talented, collaborative team that is very passionate about solving this massive problem.
Location:
Work from Home
About Prescribe:
Prescribe (YC W21) is one of India's fastest-growing startups in the healthcare sector, founded by IIT Madras alumni. We are building GetYara, a D2C brand in the natural healthcare space.
Requirements and Responsibilities:
Below is a list of several skills required to deliver on responsibilities for this role:
- Comfortable with AWS infrastructure.
- Sound knowledge of JavaScript (ES6).
- Great at problem-solving
- Experience with Amplify and CloudFormation is a plus.
- Bonus: GraphQL, Elasticsearch, SQL
Benefits
- Work flexibility
- Medical insurance
- Work from Home
- Stock Options based on performance
If you have always thought of yourself as entrepreneurial, customer-obsessed, results-oriented, strategic yet execution-focused, and hungry and passionate about technology, we have a dream opportunity for you.
- Optimizing components for maximum performance across a vast array of web-capable devices and browsers
- Ability to understand business requirements and translate them into technical requirements
- You will ensure that these components and the overall application are robust and easy to maintain.
- Have a good understanding of design and user experience principles.
- Open-minded, flexible, and willing to adapt to changing situations
- Ability to work independently as well as on a team and learn from colleagues
- High adaptability in a dynamic start-up environment
- Ensuring technologies are used efficiently, profitably, and securely. Evaluating and implementing new systems and infrastructure.
- Excellent troubleshooting, analytical and problem-solving abilities with a tenacious commitment to finding the root cause of issues.
Qualifications :
- B.Tech/B.E./M.Tech/BCA/MCA or a related technical discipline from a reputed university
Skills Required :
- Minimum 1+ years of experience in REST API development using Python.
- Must have experience with at least one popular Python framework, such as Django, Flask, or FastAPI.
- Experience in designing and developing RESTful web services and remote procedure calls.
- Able to integrate multiple data sources and databases into one system
- Knowledge of load testing and optimizing code for performance, security, and scalability
- Experience with Unit testing frameworks in Python
- Knowledge of modern authorization & authentication mechanisms.
- Able to create database schemas that represent and support business processes
- Knowledge of Version Control System & basics of CI/CD
- Write and maintain technical documentation
- Solid foundation in OOP, data structures, and algorithms
- Knowledge of SDLC phases of project development.
- Experience in one or more NoSQL databases such as MongoDB and Cassandra.
- Experience with event-based databases and programming
- Good experience in asynchronous programming and thread-based programming.
- Experience with schedulers.
- Good to have knowledge of Linux shell commands.
- Good to have experience in Kubernetes & Docker
- Knowledge of any Cloud Service like GCP or AWS would be an added advantage
- Cares deeply about writing quality, testable and modular code
- Good to have an understanding of NLP.
Salary Range: 15 to 25 LPA
Job Description
We are looking for a React Native developer interested in building performant mobile apps on both the iOS and Android platforms. You will be responsible for architecting and building these applications, as well as coordinating with the teams responsible for other layers of the product infrastructure. Building a product is a highly collaborative effort, and as such, a strong team player with a commitment to perfection is required.
Responsibilities
- Build pixel-perfect, buttery smooth UIs across both mobile platforms.
- Leverage native APIs for deep integrations with both platforms.
- Diagnose and fix bugs and performance bottlenecks for performance that feels native.
- Reach out to the open-source community to encourage and help implement mission-critical software fixes—React Native moves fast and often breaks things.
- Maintain code and write automated tests to ensure the product is of the highest quality.
- Transition existing React web apps to React Native.
Skills
- Firm grasp of the JavaScript language and its nuances, including ES6+ syntax
- Knowledge of object-oriented programming
- Ability to write well-documented, clean JavaScript code
- Rock-solid at working with third-party dependencies and debugging dependency conflicts
- Familiarity with native build tools, like Xcode and Gradle
- Understanding of REST APIs, the document request model, and offline storage
- Experience with automated testing suites
JD:
Your role will include:
- Writing and testing your code, innovating and contributing towards increasing the value delivered by your team.
- Setting a high bar through your design, development, analysis and deployment activities
- Understanding and participating in evolving the architecture of our products.
- Keeping up-to-date with new technologies, best practices, and work on optimizing the tooling and automation.
- Understanding the latest development and engineering paradigms like Scrum/Agile/TDD/BDD/DDD etc.
You have experience with the following:
- Strong experience leading and being part of technical teams, preferably following agile methodology.
- Strong technical background with the ability to provide technical guidance to other team members.
- Knowledge of microservices, with hands-on experience implementing at least a few microservices.
- Knowledge of API-driven platform development and software integration.
- You have hands-on experience in building secure, high-performing and scalable systems in Java.
- Exposure to JVM-based languages like Java, Scala and Clojure.
We are looking for highly motivated engineers with the following skill set:
- An engineering degree in Computer Science, ECE, or an MCA is a must
- In-depth knowledge of PHP, HTML5, CSS, jQuery, JavaScript, AJAX
- Experience using a PHP framework such as Laravel, Symfony or Zend - at least one is required
- VueJS, AngularJS, ReactJS - knowledge of at least one is required
- Between 1-5 years of work experience is required
- Very strong data structure and algorithm knowledge
- The ideal candidate is a very strong developer with great coding skills
- Good English communication skills
- Willingness to work on multiple projects simultaneously
- Ability to work through ambiguous requirements and come up with a clear scope and problem statement
- If you have a live project portfolio or a GitHub handle, please share it
- Office is in Fairlie Place (Dalhousie), Kolkata 700 001