- Overall, 4-5 years of experience
- At least 3 years of HTML/CSS development experience is required
- Must have solid working experience with HTML5, CSS3, and XML
- Should be an expert in XPath/regex expressions for complex website navigation
- Working knowledge of core JavaScript
- Experience mentoring junior team members is desirable
- Must have good communication skills
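The XPath and regex skills listed above can be sketched in a few lines. This is a minimal illustration using only the standard library; the page markup, URL pattern, and function name are invented for the example, and a real scraper would use an HTML-tolerant parser such as lxml rather than `ElementTree`, which only accepts well-formed XML and supports a limited XPath subset.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical, well-formed product-page snippet (illustrative only).
PAGE = """
<html>
  <body>
    <div class="product"><a href="/item/101">Widget</a></div>
    <div class="product"><a href="/item/202">Gadget</a></div>
    <div class="ad"><a href="/promo">Sale</a></div>
  </body>
</html>
"""

def extract_item_ids(xml_text):
    """Navigate with (limited) XPath, then pull the numeric id via regex."""
    root = ET.fromstring(xml_text)
    ids = []
    # ElementTree supports a subset of XPath: descendant axis + predicates.
    for link in root.findall(".//div[@class='product']/a"):
        match = re.search(r"/item/(\d+)$", link.get("href", ""))
        if match:
            ids.append(int(match.group(1)))
    return ids

print(extract_item_ids(PAGE))  # -> [101, 202]
```

The XPath predicate narrows navigation to the relevant nodes, and the regex then extracts the structured value from the attribute, which is the typical division of labour between the two tools.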
About Gipfel & Schnell Consultings Pvt Ltd
*Apply only if you are serving Notice Period
HIRING: SQL Developers with a maximum notice period of 20 days
Job ID: TNS2023DB01
Who Should apply?
- Only serious job seekers who are ready to work the night shift
- Technically strong candidates who are willing to take on challenging roles and want to grow their careers
- No DBAs & BI Developers, please
Why Think n Solutions Software?
- Exposure to the latest technology
- Opportunity to work on different platforms
- Rapid Career Growth
- Friendly Knowledge-Sharing Environment
Criteria:
- BE/MTech/MCA/MSc
- 2+ years of hands-on experience in MS SQL / NoSQL
- Immediate joiners preferred; maximum notice period of 15-20 days
- Candidates will be selected based on logical/technical and scenario-based testing
- Work time - 10:00 pm to 6:00 am
Note: Candidates who have attended the interview process with TnS in the last 6 months will not be eligible.
Job Description:
- Technical Skills Desired:
- Experience in MS SQL Server plus one of these relational DBs (PostgreSQL / AWS Aurora DB / MySQL) or any NoSQL DB (MongoDB / DynamoDB / DocumentDB) in an application development environment, with an eagerness to switch between them
- Design database tables, views, indexes
- Write functions and procedures for the middle-tier development team
- Work with any front-end developers in completing the database modules end to end (hands-on experience in the parsing of JSON & XML in Stored Procedures would be an added advantage).
- Query Optimization for performance improvement
- Design & develop SSIS Packages or any other Transformation tools for ETL
- Functional Skills Desired:
- Experience in the Banking / Insurance / Retail domain would be an added advantage
- Experience interacting with clients would be an added advantage
- Good-to-Have Skills:
- Knowledge in a Cloud Platform (AWS / Azure)
- Knowledge on version control system (SVN / Git)
- Exposure to Quality and Process Management
- Knowledge in Agile Methodology
- Additional Soft Skills:
- Team building (attitude to train, work along, and mentor juniors)
- Communication skills (all kinds)
- Quality consciousness
- Analytical approach to all business requirements
- Out-of-the-box thinking for business solutions
We are looking for a passionate and experienced Data Analyst to join our team! As a Data Analyst at Oneistox, you will play an extremely important role as your insights and findings will be crucial for our growth and success.
Job Responsibilities
- Execution of data validation, profiling, auditing and data cleansing activities
- Collaboration with internal and external stakeholders
- Development, production and management of data quality reports
- Development of key metrics, rules and notifications to identify critical gaps
- Identify opportunities for business process improvements
- Develop and maintain KPIs Dashboards
- Support all Marketing and Sales data requests
- Standardization and automation of data collection and processing
Job Requirements
- BS in Computer Science, Mathematics or a similar field
- 2-3 years of experience as a Data Analyst or a similar role
- Experience with analytics platforms like Google Analytics, HubSpot, and Amplitude is a must
- Familiarity with JavaScript is preferred
- Advanced Excel is a must; Pivot Tables and Macros preferred
- Experience with using a range of data analysis tools
- Advanced analytics capability is a preferred skill
- Understanding of multiple regression analyses
- Experience with performing analysis in a database environment is preferred
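The multiple regression understanding asked for above can be made concrete with a tiny ordinary-least-squares fit. This is a stdlib-only sketch via the normal equations, with invented data and function names; a real analysis would use numpy or statsmodels rather than hand-rolled linear algebra.

```python
def fit_ols(xs, ys):
    """Solve (X^T X) beta = X^T y by Gaussian elimination.

    xs: list of predictor rows; an intercept column is prepended."""
    X = [[1.0] + list(row) for row in xs]
    k = len(X[0])
    # Build the augmented normal-equation system [X^T X | X^T y].
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) for j in range(k)]
         + [sum(X[r][i] * ys[r] for r in range(len(X)))]
         for i in range(k)]
    for col in range(k):                      # forward elimination w/ pivoting
        pivot = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            A[r] = [a - f * b for a, b in zip(A[r], A[col])]
    beta = [0.0] * k                          # back substitution
    for i in reversed(range(k)):
        beta[i] = (A[i][k] - sum(A[i][j] * beta[j]
                                 for j in range(i + 1, k))) / A[i][i]
    return beta

# y = 2 + 3*x1 - x2, recovered exactly from noiseless illustrative data.
xs = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
ys = [2 + 3 * x1 - x2 for x1, x2 in xs]
print([round(b, 6) for b in fit_ols(xs, ys)])  # -> [2.0, 3.0, -1.0]
```

With two predictors the fitted coefficients separate each variable's contribution while holding the other fixed, which is the core idea an analyst needs when interpreting multiple-regression output.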
Python + Data Scientist:
• Build data-driven models to understand the characteristics of engineering systems
• Train, tune, validate, and monitor predictive models
• Sound knowledge of statistics
• Experience developing data processing tasks using PySpark, such as reading, merging, enrichment, and loading of data from external systems to target data destinations
• Working knowledge of Big Data and/or Hadoop environments
• Experience creating CI/CD pipelines using Jenkins or similar tools
• Practiced in eXtreme Programming (XP) disciplines
1. Use Python Scrapy to crawl the website
2. Work on dynamic websites and solve crawling challenges
3. Work in a fast-paced startup environment
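A full Scrapy crawl needs a live site and a running crawler process, so here is a stdlib-only sketch of just the parsing step that a Scrapy spider's `parse()` callback would perform on one downloaded page. The markup and class name are illustrative, not part of any real site.

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href on a page, as a spider would before following them."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

# Illustrative page fragment standing in for a downloaded response body.
PAGE = ('<ul><li><a href="/jobs/1">SQL Dev</a></li>'
        '<li><a href="/jobs/2">BI Dev</a></li></ul>')
parser = LinkCollector()
parser.feed(PAGE)
print(parser.links)  # -> ['/jobs/1', '/jobs/2']
```

In an actual Scrapy spider the same extraction would use response selectors (CSS/XPath), and dynamic sites would additionally need rendered-page handling, which is where the "crawling challenges" above come in.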
Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types
including text, video and audio through to live stream and IoT in an agile project delivery
environment with a focus on DataOps and Data Observability. You will work with Azure SQL
Databases, Synapse Analytics, Azure Data Factory, Azure Data Lake Gen2, Azure Databricks, Azure
Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data
Catalogue and Purview among other tools, gaining opportunities to learn some of the most
advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with
fellow architects, engineers, analysts, and team leads and project managers to plan, build and roll
out data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise of data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX)
across a wide range of electronic delivery mechanisms (API, SFTP, etc.).
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented/object function scripting languages such as Python, Java, JavaScript, C#,
Scala, etc is required.
Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
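The integration of structured and flat feeds described above can be sketched as a small normalisation step before loading. This stdlib-only example is illustrative: the field names, sample payloads, and `normalise`/`ingest` helpers are invented, and production pipelines would run the same logic inside Azure Data Factory or equivalent tooling.

```python
import csv
import io
import json

# Two hypothetical delivery formats for the same kind of record:
# a structured JSON feed and a flat CSV feed.
JSON_FEED = '[{"id": 1, "amount": "10.50"}, {"id": 2, "amount": "4.25"}]'
CSV_FEED = "id,amount\n3,7.00\n4,1.75\n"

def normalise(record):
    """Coerce a raw record into one common, typed shape."""
    return {"id": int(record["id"]), "amount": float(record["amount"])}

def ingest(json_text, csv_text):
    """Parse each feed with its own reader, then merge into one record list."""
    rows = [normalise(r) for r in json.loads(json_text)]
    rows += [normalise(r) for r in csv.DictReader(io.StringIO(csv_text))]
    return rows

rows = ingest(JSON_FEED, CSV_FEED)
print(len(rows), sum(r["amount"] for r in rows))  # -> 4 23.5
```

The point of the sketch is the separation: format-specific parsing at the edge, a single canonical schema in the middle, which is what makes downstream auditing and loading uniform.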
Essential Experience:
5 or more years of hands-on experience in a data architect role with the development of ingestion,
integration, data auditing, reporting, and testing with Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data
solutions) in Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its
resources to implement solutions is a must.
Experience working in the Public sector, or in an organisation servicing the Public sector, is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences: in writing, through presentations, and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect.
Excellent interpersonal skills for working with teams and building trust with clients.
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
● Frame ML / AI use cases that can improve the company’s product
● Implement and develop ML / AI / Data driven rule based algorithms as software items
● For example, building a chatbot that replies with an answer from the relevant FAQ, and
reinforcing the system with a feedback loop so that the bot improves over time
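The FAQ-chatbot-with-feedback idea above can be sketched in a few lines of stdlib Python. This is a toy illustration: the FAQ content, the thumbs-up feedback scheme, and the function names are all invented, and a production system would use embeddings or a trained retriever rather than string similarity.

```python
import difflib

# Toy FAQ store: question -> canned answer (illustrative content).
FAQ = {
    "how do i reset my password": "Use the 'Forgot password' link on the login page.",
    "how do i cancel my subscription": "Go to Settings > Billing and choose Cancel.",
}
feedback = {q: 0 for q in FAQ}  # thumbs-up counts per FAQ entry

def answer(question):
    """Return the best-matching FAQ answer, or None if nothing is close."""
    matches = difflib.get_close_matches(question.lower(), list(FAQ),
                                        n=3, cutoff=0.4)
    if not matches:
        return None
    # Among close matches, prefer entries users have upvoted, then similarity.
    best = max(matches, key=lambda q: (
        feedback[q],
        difflib.SequenceMatcher(None, question.lower(), q).ratio(),
    ))
    return FAQ[best]

def upvote(faq_question):
    """The feedback loop: a thumbs-up boosts that entry in future ranking."""
    feedback[faq_question] += 1

print(answer("How can I reset my password?"))
```

Even in this toy form, the two halves of the bullet are visible: retrieval (`answer`) and the feedback signal (`upvote`) that re-ranks future retrievals.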
Must have skills:
● Data extraction and ETL
● Python (numpy, pandas, comfortable with OOP)
● Django
● Knowledge of basic Machine Learning / Deep Learning / AI algorithms and ability to
implement them
● Good understanding of SDLC
● Deployed ML / AI model in a mobile / web product
● Soft skills: strong communication skills & critical-thinking ability
Good to have:
● Full stack development experience
Required Qualification:
B.Tech. / B.E. degree in Computer Science or equivalent software engineering
- 4-5 years of proven experience as a BI developer
- BI tools such as SiSense (preferred), Tableau, PowerBI
- Ability to create ETL and reporting dashboards
- Expert level proficiency in SQL and data modelling
- Knowledge of JavaScript, D3.js, Chart.js, Visualise.js, JSON, HTML and CSS (Any 2)
- Solid experience on:
1. Developing custom BI solutions and visualisations.
2. Writing relational and multi-dimensional database queries.
3. Debugging, monitoring and troubleshooting BI solutions
4. Creating and deploying reports
5. Designing and troubleshooting BI models.
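The relational query writing behind items 2-4 above can be sketched with an in-memory SQLite database. The table, columns, and figures are invented for illustration; a real BI backend would run an equivalent aggregate against PostgreSQL/MSSQL and feed the result to the dashboard layer.

```python
import sqlite3

# In-memory database with a hypothetical sales fact table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'Widget', 120.0),
        ('North', 'Gadget', 80.0),
        ('South', 'Widget', 50.0);
""")
# Revenue per region, largest first: the shape of a typical dashboard tile.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY 2 DESC"
).fetchall()
print(rows)  # -> [('North', 200.0), ('South', 50.0)]
conn.close()
```

Debugging a BI solution usually starts exactly here: confirming the underlying grouped query returns the numbers the visualisation is supposed to show.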
Nice to Have
- Worked on PostgreSQL, MySQL, MSSQL and Jasper
Soft Skills
1. Ability to take ownership of assigned tasks and work independently
2. Ability to work with the team and other stakeholders as needed; good communication skills
3. Adherence to deadlines and focus on deliverables
4. Strong attention to detail