About iLink Systems
Ability to understand and interpret data within the context of the product/business; solve problems and distill data into actionable recommendations.
Strong communication skills, with the ability to confidently work with cross-functional teams across the globe and to present information to all levels of the organization.
Intellectual and analytical curiosity - initiative to dig into the why, what & how.
Strong number crunching and quantitative skills.
Advanced knowledge of MS Excel and PowerPoint.
Good hands-on SQL skills.
Experience with Google Analytics, Optimize, Tag Manager, and other Google Suite tools.
Understanding of Business analytics tools & statistical programming languages - R, SAS, SPSS, Tableau is a plus
Inherent interest in e-commerce & marketplace technology platforms and broadly in the consumer Internet & mobile space.
1+ years of previous experience in a product analytics role at a product company.
Strong understanding of building and interpreting product funnels.
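As a hypothetical illustration of what "building and interpreting product funnels" involves, here is a minimal Python sketch; the step names and event data are invented, and real funnels would be computed from an analytics store rather than in-memory dictionaries:

```python
# Minimal funnel sketch: count users who reach each ordered step and
# compute step-to-step conversion. Data and step names are illustrative.

def funnel_conversion(events, steps):
    """events: {user_id: set of step names completed}.
    Returns [(step, users_reaching_step, conversion_vs_previous_step)]."""
    report = []
    reached = set(events)
    prev_count = None
    for step in steps:
        reached = {u for u in reached if step in events[u]}
        count = len(reached)
        rate = 1.0 if prev_count is None else (count / prev_count if prev_count else 0.0)
        report.append((step, count, rate))
        prev_count = count
    return report

events = {
    "u1": {"visit", "signup", "purchase"},
    "u2": {"visit", "signup"},
    "u3": {"visit"},
    "u4": {"visit", "signup"},
}
print(funnel_conversion(events, ["visit", "signup", "purchase"]))
```

Interpreting the output is the analyst's job: a sharp drop between two steps (here, signup to purchase) points at where the product loses users.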
• Developing effective QlikView/Qlik Sense data models
• Developing front end applications using Qlik technology
• Utilizing scripting language to meet complex business requirements
• Utilizing Qlik Publisher / NPrinting capabilities
• Extract, transform and load (ETL) data from multiple data sources into the Qlik application
• Design, build, test, and debug Qlik solutions based upon specified requirements
• Follow implementation standards
• Utilize source control tools
• Follow deployment process
• Experience creating extract/transform/load (ETL) routines from sources including SAP BW, SAP R/3, MS SQL Server, DB2, and Oracle, as well as other data sources
• Solid experience developing complex Qlik data models
• Participating in business requirements and design review sessions
• Providing input on proposing, evaluating, and selecting appropriate
design alternatives which meet requirements and are consistent with our
current standards and processes
• Extracting, transforming and loading data into Qlik applications
• Developing, testing, debugging Qlik applications
• Migrating code across development and testing landscapes
• Creating publisher jobs
• Developing documentation
• Transferring knowledge and handing the application over to the BI Support team
• Good communication skills and ability to interact with the customer
• Willingness to travel is mandatory
• Experience with Qlik Sense and GeoAnalytics is an added advantage
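The extract/transform/load routines described above follow a common pattern regardless of tool. A hedged sketch in Python, using an in-memory SQLite database as the load target; the source rows, field names, and normalization rule are invented, and real sources (SAP BW, MS SQL Server, etc.) would be read through their own connectors:

```python
import sqlite3

def etl(sources):
    """Extract rows from each source, normalize them, load into SQLite.
    Illustrative only: sources are plain lists of dicts here."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    for rows in sources:                              # extract
        for row in rows:
            region = row["region"].strip().upper()    # transform: normalize
            amount = float(row["amount"])             # transform: cast
            conn.execute("INSERT INTO sales VALUES (?, ?)", (region, amount))
    conn.commit()
    return conn

source_a = [{"region": "emea ", "amount": "100.5"}]
source_b = [{"region": "APAC", "amount": "200"}]
conn = etl([source_a, source_b])
total = conn.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 300.5
```

In Qlik the same steps would live in the load script, but the discipline is identical: extract from each source, apply consistent transforms, and load into one model.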
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
Job Description :
- Candidate should have strong technical and analytical skills, particularly in SQL Server, reporting tools such as Tableau, Power BI, and SSRS, and .NET.
- Candidate should have the experience needed to properly understand the project deliverables.
- Candidate should be responsible for the respective tasks assigned in the project.
- Candidate will be responsible for delivering with proper quality, within the planned time and cost, adhering to the industry standards defined for the project.
- Candidate should be involved in client interaction.
- Candidate should possess excellent communication skills.
Required Skills : BI Gateway, MS SQL Server, Tableau, Power BI, .NET, OLAP, UI/UX, Dashboard Building
Experience : 5+ years
Job Location : Remote/Saudi Arabia
Work Timings : 2:30 pm to 11:30 pm
What will you do?
You will help build cutting-edge products in various verticals.
You will have to understand the solution domain and understand/architect the data flow.
You will be accountable for the data models and data pipelines driving the solution.
You will also research and iterate toward better solutions, which involves staying up to speed with the latest technologies in the data space.
Should have a clear understanding of one or more of the below technologies –
• Database: PostgreSQL, MySQL etc.
• BI Reporting: QlikView, Qlik Sense, SSRS, Tableau & Power BI.
• Cloud – One of AWS, Azure, GCP
• Big Data – Spark SQL, Scala, PySpark, Redshift, Hive, HDFS, Cloudera
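To make the data-pipeline responsibility above concrete, here is the kind of group-by aggregation a Spark SQL pipeline performs, sketched in plain Python since the actual stack varies by project; the column names and rows are invented (in PySpark this would be roughly `df.groupBy("category").sum("value")`):

```python
from collections import defaultdict

def group_sum(rows, key, value):
    """Aggregate `value` per distinct `key` — the core of most pipelines."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

rows = [
    {"category": "a", "value": 1.0},
    {"category": "b", "value": 2.5},
    {"category": "a", "value": 3.0},
]
print(group_sum(rows, "category", "value"))  # {'a': 4.0, 'b': 2.5}
```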
Zemoso Technologies is a Software Product Market Fit Studio that brings Silicon Valley-style rapid prototyping and rapid application builds to entrepreneurs and corporate innovation. We offer Innovation as a Service, working on ideas from scratch and taking them to the Product Market Fit stage using Design Thinking -> Lean Execution -> Agile Methodology.
We were featured as one of Deloitte's 50 fastest-growing tech companies in India three times (2016, 2018, and 2019). We were also featured in the Deloitte Technology Fast 500 Asia Pacific in both 2016 and 2018.
We are located in Hyderabad, India, and Dallas, US. We have recently incorporated another office in Waterloo, Canada.
Our founders have had past successes: they founded a decision-management company acquired by SAP AG (now part of the HANA Big Data stack & NetWeaver BPM), were part of the early engineering team at Zoho (a leading billion-dollar SaaS player), and bring private equity experience.
Marquee customers along with some exciting start-ups are part of our clientele.
A proficient, independent contributor who assists in the technical design, development, implementation, and support of data pipelines, and is beginning to invest in less-experienced engineers.
- Design, create, and maintain on-premise and cloud-based data integration pipelines.
- Assemble large, complex data sets that meet functional/non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
- Build analytics tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data pipelines that enable the BI, Analytics, and Data Science teams to build and optimize their systems.
- Assist in the onboarding, training, and development of team members.
- Review code changes and pull requests for standardization and best practices.
- Evolve existing development to be automated, scalable, resilient, self-serve platforms
- Assist the team in design and requirements gathering for technical and non-technical work to drive the direction of projects
Technical & Business Expertise:
- Hands-on integration experience with SSIS/MuleSoft
- Hands-on experience with Azure Synapse
- Proven advanced experience writing SQL in SQL Server
- Proven advanced understanding of Data Lake concepts
- Proven intermediate proficiency in Python or a similar programming language
- Intermediate understanding of Cloud Platforms (GCP)
- Intermediate understanding of Data Warehousing
- Advanced understanding of Source Control (GitHub)
- Be part of the CS Strategy & Operations team that strives to make a difference.
- Assimilate data and analytics to create daily, weekly, monthly and quarterly reports.
- Analyze reports to detect problems during data collection and help review data that has been collected.
- Monitor data to identify trends and anomalies that might indicate abnormal behavior.
- Interpret data, develop dashboards and reports.
- Generate and share routine/ ad-hoc reports as required.
- Work with team members to drive CS processes.
- Monitor, update, and drive implementation of MIS reports.
Our ideal candidate:
- Experience of 2-3 years in a high-transaction MIS environment.
- Knowledge of Google Sheets / Excel (VLOOKUP, pivot tables, charts, and trend metrics)
- Strong knowledge of Power BI, SQL, Data Studio
- Good written and verbal communication.
- Willingness to learn new things and the ability to quickly apply them.
- Is an empathetic team player.
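The spreadsheet skills listed above (VLOOKUP, pivot tables) map directly onto two small operations, sketched here in Python with invented sample data, purely to illustrate what the role exercises daily:

```python
def vlookup(key, table, col):
    """Like Excel VLOOKUP: return column `col` of the first row
    whose first column equals `key`, or None if not found."""
    for row in table:
        if row[0] == key:
            return row[col]
    return None

def pivot_count(rows, key):
    """Like a pivot table counting rows per distinct value of `key`."""
    counts = {}
    for row in rows:
        counts[row[key]] = counts.get(row[key], 0) + 1
    return counts

table = [("SKU1", "Widget", 9.99), ("SKU2", "Gadget", 4.5)]
print(vlookup("SKU2", table, 2))  # 4.5

tickets = [{"region": "North"}, {"region": "South"}, {"region": "North"}]
print(pivot_count(tickets, "region"))  # {'North': 2, 'South': 1}
```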
About the Company:
It is a Data as a Service company that helps businesses harness the power of data. Our technology fuels some of the most interesting big data projects in the world. We are a small bunch of people working towards shaping the imminent data-driven future by solving some of its fundamental and toughest challenges.
Role: We are looking for an experienced team lead to drive data acquisition projects end to end. In this role, you will be working in the web scraping team with data engineers, helping them solve complex web problems and mentor them along the way. You’ll be adept at delivering large-scale web crawling projects, breaking down barriers for your team and planning at a higher level, and getting into the detail to make things happen when needed.
- Interface with clients and sales team to translate functional requirements into technical requirements
- Plan and estimate tasks with your team, in collaboration with the delivery managers
- Engineer complex data acquisition projects
- Guide and mentor your team of engineers
- Anticipate issues that might arise and proactively consider those into design
- Perform code reviews and suggest design changes
- 5 to 8 years of relevant experience
- Fluent programming skills and well-versed with scripting languages like Python or Ruby
- Solid foundation in data structures and algorithms
- Excellent tech troubleshooting skills
- Good understanding of web data landscape
- Prior exposure to the DOM and XPath, and hands-on experience with Selenium/automated testing, is a plus
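The DOM/XPath exposure mentioned above can be illustrated with the standard library alone; in a real crawl, Selenium or an HTML parser would supply the document, and the markup below is invented for the sketch (`xml.etree.ElementTree` supports only a limited XPath subset):

```python
import xml.etree.ElementTree as ET

# Illustrative document — real pages come from a browser or HTTP fetch.
doc = ET.fromstring(
    "<html><body>"
    "<div class='item'><span>alpha</span></div>"
    "<div class='item'><span>beta</span></div>"
    "</body></html>"
)

# XPath-style query: every <span> under a <div class='item'>.
names = [span.text for span in doc.findall(".//div[@class='item']/span")]
print(names)  # ['alpha', 'beta']
```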
Skills and competencies
- Prior experience with team handling and people management is mandatory
- Work independently with little to no supervision
- Extremely high attention to detail
- Ability to juggle between multiple projects
- Own the design, development, testing, deployment, and craftsmanship of the team’s infrastructure and systems capable of handling massive amounts of requests with high reliability and scalability
- Leverage the deep and broad technical expertise to mentor engineers and provide leadership on resolving complex technology issues
- Entrepreneurial and out-of-box thinking essential for a technology startup
- Guide the team for unit-test code for robustness, including edge cases, usability, and general reliability
- In-depth understanding of image processing algorithms, pattern recognition methods, and rule-based classifiers
- Experience in feature extraction, object recognition and tracking, image registration, noise reduction, image calibration, and correction
- Ability to understand, optimize and debug imaging algorithms
- Understanding of and experience with the OpenCV library
- Fundamental understanding of mathematical techniques involved in ML and DL schemas (Instance-based methods, Boosting methods, PGM, Neural Networks etc.)
- Thorough understanding of state-of-the-art DL concepts (Sequence modeling, Attention, Convolution etc.) along with knack to imagine new schemas that work for the given data.
- Understanding of engineering principles and a clear understanding of data structures and algorithms
- Experience writing production-level code in either C++ or Java
- Experience with Python libraries such as pandas, NumPy, and SciPy
- Experience with TensorFlow and scikit-learn.
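As a small, hedged example of the noise-reduction side of the image-processing requirements above: a 3x3 mean filter, written in plain Python on a 2D list so it stays self-contained (OpenCV's `cv2.blur` performs the same operation on real images; the tiny input image is invented):

```python
def mean_filter3(img):
    """Apply a 3x3 mean filter to a 2D list of numbers.
    Only the valid interior region is returned (no border padding)."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            s = sum(img[y + dy][x + dx]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            row.append(s / 9.0)
        out.append(row)
    return out

# A single bright (noisy) pixel is averaged away by its neighborhood.
img = [
    [0, 0, 0],
    [0, 9, 0],
    [0, 0, 0],
]
print(mean_filter3(img))  # [[1.0]]
```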
Should have Business Intelligence Experience in a data warehouse environment
Should have good experience in writing Power Query, DAX, MDX for complex data projects
Should have good knowledge of REST services, including API documentation.
Should have Experience authoring, diagnosing, and altering SQL Server objects and T-SQL
Should have worked on Tabular models in Azure Analysis Services or SSAS
Should have Experience in Microsoft Azure Platform