![Nyteco's logo](/_next/image?url=https%3A%2F%2Fcdnv2.cutshort.io%2Fcompany-static%2F63850dfafed8910025edaf1e%2Fuser_uploaded_data%2Flogos%2FNYTECO_ALTERNATIVE_eqRmTtEF.png&w=3840&q=75)
Join Our Journey
Jules develops an amazing end-to-end solution for recycled materials traders, importers and exporters, which means a lot of internal, structured data to play with in order to provide reporting, alerting and insights to end-users. With about 200 tables covering all business processes, from order management to payments, including logistics, hedging and claims, the data entered in Jules can unlock massive value.
After starting with a simple stack made of Postgres, SQL queries and a visualization solution, the company is now ready to set up its data stack, and is only missing you. We are thinking DBT, Redshift or Snowflake, Fivetran, Metabase or Luzmo, etc. We also have an AI team already experimenting with text-driven data interaction.
As a Data Engineer at Jules AI, your duties will involve both data engineering and product analytics, enhancing our data ecosystem. You will collaborate with cross-functional teams to design, develop, and sustain data pipelines, and conduct detailed analyses to generate actionable insights.
Roles And Responsibilities:
- Work with stakeholders to determine data needs, and design and build scalable data pipelines.
- Develop and sustain ELT processes to guarantee timely and precise data availability for analytical purposes.
- Construct and oversee large-scale data pipelines that collect data from various sources.
- Expand and refine our DBT setup for data transformation.
- Engage with our data platform team to address customer issues.
- Apply your advanced SQL and big data expertise to develop innovative data solutions.
- Enhance and debug existing data pipelines for improved performance and reliability.
- Generate and update dashboards and reports to share analytical results with stakeholders.
- Implement data quality controls and validation procedures to maintain data accuracy and integrity.
- Work with various teams to incorporate analytics into product development efforts.
- Use technologies like Snowflake, DBT, and Fivetran effectively.
Mandatory Qualifications:
- Hold a Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
- Possess at least 4 years of experience in Data Engineering, ETL Building, database management, and Data Warehousing.
- Demonstrated expertise as an Analytics Engineer or in a similar role.
- Proficient in SQL, a scripting language (Python), and a data visualization tool.
- Mandatory experience in working with DBT.
- Experience in working with Airflow, and cloud platforms like AWS, GCP, or Snowflake.
- Deep knowledge of ETL/ELT patterns.
- At least 1 year of experience building data pipelines and leading data warehouse projects.
- Experienced in mentoring data professionals across all levels, from junior to senior.
- Proven track record in establishing new data engineering processes and navigating through ambiguity.
- Preferred skills: knowledge of Snowflake and reverse ETL tools is advantageous.
Grow, Develop, and Thrive With Us
- Global Collaboration: Work with a dynamic team that’s making an impact across the globe, in the recycling industry and beyond. We have customers in India, Singapore, the United States, Mexico, Germany, France and more.
- Professional Growth: a clear path toward setting up a great data team and evolving into a leader.
- Flexible Work Environment: Competitive compensation, performance-based rewards, health benefits, paid time off, and flexible working hours to support your well-being.
Apply to us directly: https://nyteco.keka.com/careers/jobdetails/41442
About Nyteco
Nyteco Inc is a green tech venture for the recycled materials industry and manufacturing supply chain.
We serve the industry through our flagship company - Jules AI.
Nyteco aims to bring leading tech solutions to the recycling industry to help grow its trading business, connect with one another and much more!
Similar jobs
Position: Java Developer
Experience: 3-8 Years
Location: Bengaluru
We are a multi-award-winning creative engineering company offering design and technology solutions on mobile, web and cloud platforms. We are looking for an enthusiastic and self-driven Java Developer to join our team.
Roles and Responsibilities:
- Expert-level microservices development skills using Java/J2EE/Spring
- Strong in SQL and NoSQL databases (MySQL / MongoDB preferred)
- Ability to develop software programs with the best design patterns, data structures & algorithms
- Work in a very challenging, high-performance environment to clearly understand requirements and provide state-of-the-art solutions (via design and code)
- Ability to debug complex applications and help in providing durable fixes
- While the Java platform is primary, ability to understand, debug and work on other application platforms using Ruby on Rails and Python
- Responsible for delivering feature changes and functional additions that handle millions of requests per day while adhering to quality and schedule targets
- Extensive knowledge of at least one cloud platform (AWS, Microsoft Azure, GCP), preferably AWS
- Strong unit testing skills for frontend and backend using any standard framework
- Exposure to application gateways and dockerized microservices
Desired Profile:
- Programming language – Java
- Framework – Spring Boot
- Good Knowledge of SQL & NoSQL DB
- AWS Cloud Knowledge
- Micro Service Architecture
Good to Have:
- Familiarity with Web Front End (Java Script/React)
- Familiarity with working in Internet of Things / Hardware integration
- Docker & Kubernetes, Serverless Architecture
- Working experience in an energy company (solar panels + battery)
For one of our premium customers, we are looking to hire a team of Azure .NET Architects in Bangalore/Pune/Noida: tech geeks with 14+ years of full-time experience.
Please find below the JD: proven full-stack development skills, solid business acumen, and hands-on experience in architecting solutions.
- 14+ years of experience
- Expertise in designing and developing cloud-native applications using Azure services and managing IaC using Terraform modules and ARM templates
- Cloud technologies: Azure PaaS services (experience in at least some of the following is good enough) - Logic Apps, API Management Service, Microsoft Graph API, Azure AD, Azure Functions, Cosmos DB (MongoDB API), Event Hub, Stream Analytics, Azure SQL Server, Azure Table Storage, Azure Storage Queue, Azure Blob Storage, Azure Event Grid, Azure App Service, Application Insights, Azure ARM templates, Azure Worker Roles, Azure WebJobs, Azure Relay services, Azure Key Vault
- App Service, Azure Functions, VMs, ASF, AKS, Azure Container Registry, Key Vault, Cosmos DB, Azure SQL, Azure AD, Azure B2C and B2B, APIM, Azure Monitor and App Insights
- Containerization tools: Docker, Azure Kubernetes Services, Azure Container Registry
- Infrastructure as Code tools: Terraform and ARM templates
- DevOps tools: Azure DevOps, CI/CD using Git and Azure Pipelines, Azure Repos
- Microsoft technologies: ASP.NET Core, .NET Core Web API, .NET, ASP.NET MVC, ADO.NET, TPL, LINQ, PLINQ, WCF Services, ASP.NET Web API, Angular
- Databases: NoSQL/document databases (Azure Cosmos DB), Oracle
- Strong experience in SQL
- Experience in an integration or service layer is recommended
L2 Support
Location : Mumbai, Pune, Bangalore
Requirement details : (Mandatory Skills)
- Excellent communication skills
- Production Support, Incident Management
- SQL (must have experience in writing complex queries)
- Unix (must have working experience on the Linux operating system)
- Perl/Shell scripting
- Candidates working in the Investment Banking domain will be preferred
Roles & Responsibilities
- Part of a Cloud Governance product team responsible for installing, configuring, automating and monitoring various Cloud Services (IaaS, PaaS, and SaaS)
- Be at the forefront of Cloud technology, assisting a global list of customers that consume multiple cloud environments.
- Ensure availability of internal & customers' hosts and services through monitoring, analysing metric trends and investigating alerts.
- Explore and implement a broad spectrum of open source technologies. Help the team/customer to resolve technical issues.
- Extremely customer focused, flexible and available on-call for solving critical problems.
- Contribute towards the process improvement involving the Product deployments, Cloud Governance & Customer Success.
Skills Required
- Minimum 3 years of experience, with a B.E/B.Tech
- Experience in managing Azure IaaS, PaaS services for customer production environments
- Well versed in DevOps technologies, automation, infrastructure orchestration, configuration management and CI/CD
- Experience in Linux and Windows Administration, server hardening and security compliance
- Web and Application Server technologies (e.g. Apache, Nginx, IIS)
- Good command in at least one scripting language (e.g. Bash, PowerShell, Ruby, Python)
- Networking protocols such as HTTP, DNS and TCP/IP
- Experience in managing version control platforms (e.g. Git, SVN)
Hybrid model available (2 days WFO & 3 days WFH)
WFO: 5 am - 11 pm
WFH: 12 am - 3 am
Flexible Salary bracket
![skill icon](/_next/image?url=https%3A%2F%2Fcdn.cutshort.io%2Fpublic%2Fimages%2Fskill_icons%2Fdata_analytics.png&w=32&q=75)
Website: tatvic.com
Job Description
Responsibilities of a Technical Analyst:
Responsibilities w.r.t. the customer:
- Requirement gathering: ask the customer questions to get to the main objective behind the query.
- Analyze, understand and, when necessary, document the requirement.
- Design or develop a PoC as per the business requirements / customer development team requirements.
- Complete the implementation of tasks as per the schedule.
- Make extensive use of analytics data to generate insights and recommendations.
- Work with the client's technical team to fix tracking issues.
Team Responsibilities
- Participate in the recruitment process based on your seniority. This includes interviews and creating tests as required, to recruit people who are compatible with our culture and skilled enough to accomplish the job.
- Share knowledge and coach colleagues to make the team more effective.
- Share and create content for training and publishing. This includes blogs and webinars.
- Identify potential team members who are worthy of a band change and prepare them with the required guidance.
- Identify repetitive tasks and either delegate them to other team members or automate the process to reduce TAT and improve productivity.
Technical Responsibilities
- Understanding event schemas and scripting
- Designing solutions using GTM configurations for various environments
- Creating database schemas as per business requirements
- Creating technical specifications and test plans
- Development within any frameworks/platforms used by the customer's website platform for implementing tracking components
- Understanding and implementing the various scripts and code which can work with different analytics tools
- Understanding the connection of GCP with GA 360 and how it works
- Integrating existing software products and getting various platforms to work together
- Configuring alerts on top of tracking to monitor and correct issues
- Building reusable, optimized, scalable, secure code and libraries for future use
- Initiating new research tasks which analysts at Tatvic or the client can leverage to improve business KPIs
- BE Computer Science, MCA or equivalent
- Cloud app development experience
- Strong Troubleshooting/Debugging experience
- Expert in RabbitMQ internals
- Experience with SQL and NOSQL databases
- Strong communication skills
Experience:
- Minimum 5 years of experience
- Not more than 15 years of experience
- Startup experience is a must
Location
● Remotely, anywhere in India
Timings:
- 40 hours a week, with 4 hours a day overlapping with the client's timezone. Typically clients are in the California (PST) timezone.
Position:
- Full time/Direct
- We have great benefits such as PF, medical insurance, 12 annual company holidays, 12 PTO leaves per year, annual increments, Diwali bonus, spot bonuses and other incentives.
- We don't believe in locking in people with long notice periods. You will stay here because you love the company. We have only a 15-day notice period.
The role provides L2 and L3 support and IS services to the FP&A community using the Cognos Controller applications.
- Problem determination / troubleshooting and root cause analysis skills for our IBM Cognos Controller Cloud and On-Premises offerings
- Create or enhance knowledge assets for our Knowledge Base (how-to guides, technical notes or similar).
- Contribute to key operational metrics, for example NPS, initial response times, backlog management and time to resolution
Technical expertise:
- Previous experience with IBM Cognos Controller, Planning Analytics, or equivalent financial consolidation software (e.g. Anaplan, Tagetik, OneStream, Vena Financial Close Management, Oracle Financial Consolidation)
- Web tier technologies
- Network administration
- Web applications
- Databases (SQL Server, Oracle, DB2)
- Strong troubleshooting and communication skills
- Demonstrated organization and time management skills
- Demonstrated verbal and written communication skills
- Must be self-motivated and disciplined
- Ability to recognize and prioritize critical tasks independently
1. Strong IBM Cognos Controller 10.x expertise
2. Strong SQL knowledge (Oracle, MS SQL Server)
3. Good understanding of data-warehousing concepts with the Cognos schema
4. Preferred: knowledge of Cognos finance products - Planning/TM1
5. Excellent communication skills
Cognos Controller 10.4.2 on-premise experience - this is the key requirement for the Cognos Controller search.
Job Requirement – Sr. SQL / Power BI Developer
- Minimum 6 years of expertise in SQL database development, including SQL queries, data structures, stored procedures, indexes etc.
- Excellent written and verbal communication
- Advanced knowledge and experience in T-SQL coding: query writing, stored procedures, triggers and user-defined functions.
- Advanced knowledge and experience in development and maintenance of SSIS and SSAS (Tabular and Multidimensional)
- Advanced knowledge and experience in creating dashboards and reports using PowerBI
- Advanced knowledge and experience in DAX
- Experience in creating and publishing reports using SSRS
- Experience in writing MDX queries
- Manage SQL Server databases through multiple product lifecycle environments, from development to mission-critical production systems.
- Optimize and manage SQL server databases
- Independently analyse, solve, and correct issues in real time, providing problem resolution end-to-end.
- Refine and automate regular processes, track issues, and document changes
- Share domain and technical expertise, providing technical mentorship and cross-training to other peers and team members.
- Flexible, team player, “get-it-done” personality
- Ability to organize and plan work independently
- Ability to work in a rapidly changing environment
- Ability to multi-task and context-switch effectively between different activities and teams
Optional Skills (Good to have):
- Microsoft certification in Azure data engineering
Experience: Fresher / 1-3 years
Please see the job description below :
- Gathering requirements, creating workflows and technical documentation
- Operations and coordinating with multiple clients
- Collect, clean, analyze and interpret complex data.
- Prepare reports for all the stakeholders using relevant tools
- Identify areas to increase efficiency and automation of processes
- Produce and track key performance indicators
- Monitor progress by tracking activities, resolving problems and recommending actions.
- Liaise & assist internal stakeholders with necessary data & timely information.
Competencies Required :
- Strong communication skills - able to communicate clearly, effectively and in a timely manner.
- Excellent analytical and critical thinking skills - able to analyze and solve problems.
- Ability to identify and document solutions to complex business problems with accuracy.
- Self-Starting - takes independent action and goes beyond what the job or situation requires.
- University degree in English, Communications, or Technical Writing.
Important :
- Attention to detail.
- Knowledge of Microsoft applications (Word, Visio, Excel) and of SQL, Python, HTML, JavaScript
- Results oriented - Strives to achieve high levels of individual and organizational performance.
- Strong customer relationship skills