11+ Generalized linear model Jobs in India
Senior Big Data Engineer
Note: Notice period: 45 days
Banyan Data Services (BDS) is a US-based, data-focused company specializing in comprehensive data solutions and services, headquartered in San Jose, California.
We are looking for a Senior Hadoop Big Data Engineer with expertise in solving complex data problems across a big data platform. You will be part of our development team based in Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and highly available infrastructure.
It's a once-in-a-lifetime opportunity to join our rocket-ship startup run by a world-class executive team. We are looking for candidates who aspire to be a part of the cutting-edge solutions and services we offer to address next-gen data evolution challenges.
Key Qualifications
· 5+ years of experience working with Java and Spring technologies
· At least 3 years of programming experience working with Spark on big data, including experience with data profiling and building transformations
· Knowledge of microservices architecture is a plus
· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra
· Experience with Kafka or any streaming tools
· Knowledge of Scala would be preferable
· Experience with agile application development
· Exposure to cloud technologies, including containers and Kubernetes
· Demonstrated experience performing DevOps for platforms
· Strong skills in data structures and algorithms, with the ability to write code of efficient complexity
· Exposure to Graph databases
· Passion for learning new technologies and the ability to do so quickly
· A Bachelor's degree in a computer-related field or equivalent professional experience is required
Key Responsibilities
· Scope and deliver solutions with the ability to design solutions independently based on high-level architecture
· Design and develop big data-focused microservices
· Work on big data infrastructure, distributed systems, data modeling, and query processing
· Build software with cutting-edge technologies in the cloud
· Willingness to learn new technologies and take on research-oriented projects
· Proven interpersonal skills; contribute to team efforts by accomplishing related results as needed
Responsibilities
Research, develop, and maintain machine learning and statistical models for business requirements
Work across the spectrum of statistical modelling, including supervised, unsupervised, and deep learning techniques, to apply the right level of solution to the right problem
Coordinate with different functional teams to monitor outcomes and refine/improve the machine learning models
Implement models to uncover patterns and predictions, creating business value and innovation
Identify unexplored data opportunities for the business to unlock and maximize the potential of digital data within the organization
Develop NLP concepts and algorithms to classify and summarize structured/unstructured text data
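The text-classification work described above can be sketched with a minimal bag-of-words Naive Bayes classifier in plain Python. This is an illustrative toy, not the models a production role would use (e.g., BERT); the training sentences and labels are entirely hypothetical:

```python
import math
from collections import Counter, defaultdict

# Toy, hypothetical training data for supervised text classification.
train = [
    ("refund not processed please help", "complaint"),
    ("payment failed and money deducted", "complaint"),
    ("great service quick resolution", "praise"),
    ("very happy with the support team", "praise"),
]

class NaiveBayesText:
    """Bag-of-words Naive Bayes with Laplace smoothing."""

    def fit(self, docs):
        self.word_counts = defaultdict(Counter)   # per-class word frequencies
        self.class_counts = Counter()             # per-class document counts
        for text, label in docs:
            self.class_counts[label] += 1
            self.word_counts[label].update(text.split())
        self.vocab = {w for counts in self.word_counts.values() for w in counts}
        return self

    def predict(self, text):
        total_docs = sum(self.class_counts.values())
        best_label, best_score = None, -math.inf
        for label, n_docs in self.class_counts.items():
            score = math.log(n_docs / total_docs)  # log prior
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for word in text.split():
                # Laplace-smoothed log likelihood; unseen words get count 0.
                score += math.log((self.word_counts[label][word] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

model = NaiveBayesText().fit(train)
print(model.predict("payment refund help"))  # → complaint
```

Modern NLP work would replace the bag-of-words features with transformer embeddings, but the supervised-classification framing stays the same.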
Qualifications
3+ years of experience solving complex business problems using machine learning.
Fluency in programming languages such as Python is a must, along with experience in NLP models such as BERT.
Strong analytical and critical thinking skills.
Experience in building production-quality models using state-of-the-art technologies.
Familiarity with databases is desirable.
Ability to collaborate on projects and work independently when required.
Previous experience in the Fintech/payments domain is a bonus.
You should have a Bachelor's or Master's degree in Computer Science, Statistics, Mathematics, or another quantitative field from a top-tier institute.
Company Profile :
Merilytics, an Accordion company, is a fast-growing analytics firm offering advanced and intelligent analytical solutions to clients globally. We combine domain expertise, advanced analytics, and technology to provide robust solutions for clients' business problems. You can find further details about the company at https://merilytics.com.
We partner with clients in the Private Equity, CPG, Retail, Healthcare, Media & Entertainment, Technology, and Logistics industries, providing analytical solutions that generate superior returns. We solve clients' business problems by analyzing large amounts of data to help guide their Operations, Marketing, Pricing, Customer Strategies, and much more.
Position :
- A Business Associate at Merilytics works on complex analytical projects and is the primary owner of the work streams involved.
- Business Associates are expected to lead a team of Business Analysts to deliver robust analytical solutions consistently and to mentor the Analysts for professional development.
Location : Hyderabad
Roles and Responsibilities :
The roles and responsibilities of a Business Associate will include the below:
- Proactively provide thought leadership to the team and have complete control on the delivery process of the project.
- Understand the client's point of view and translate it into sound judgment calls in ambiguous analytical situations.
- Highlight potential analytical issues upfront and resolve them independently.
- Synthesize the analysis and derive insights independently.
- Identify the crux of the client problem and leverage it to draw relevant actionable insights from the analysis/work.
- Manage multiple Analysts and provide customized guidance for individual development.
- Resonate with our five core values - Client First, Excellence, Integrity, Respect and Teamwork.
Pre-requisites and skillsets required to apply for this role :
- An undergraduate degree (B.E./B.Tech.) from a tier-1/tier-2 college is preferred.
- Should have 2-4 years of experience.
- Strong leadership & proactive communication to coordinate with the project team and other internal stakeholders.
- Ability to use business judgement and a structured approach towards solving complex problems.
- Experience in client-facing/professional services environment is a plus.
- Strong hands-on skills in analytics tools such as R, Python, SQL, and Excel are a plus.
Why Explore a Career at Merilytics :
- High growth environment: Semi-annual performance management and promotion cycles coupled with a strong meritocratic culture, enables fast track to leadership responsibility.
- Cross Domain Exposure: Interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes.
- Entrepreneurial Environment: Intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities.
- Fun culture and peer group: Non-bureaucratic and fun working environment; Strong peer environment that will challenge you and accelerate your learning curve.
Other benefits for full time employees:
(i) Health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps for employees, discounted health services (including vision, dental) for employee and family members, free doctor's consultations, counselors, etc.
(ii) Corporate Meal card options for ease of use and tax benefits.
(iii) Work dinners, team lunches, company sponsored team outings and celebrations.
(iv) Reimbursement support for travel to the office, as and when promulgated by the Company.
(v) Cab reimbursement for women employees beyond a certain time of the day.
(vi) Robust leave policy to support work-life balance, including a specially designed leave structure to support women employees for maternity and related requests.
(vii) Reward and recognition platform to celebrate professional and personal milestones.
(viii) A positive & transparent work environment including various employee engagement and employee benefit initiatives to support personal and professional learning and development.
PL/SQL Developer
Experience: 4 to 6 years
Skills: MS SQL Server and Oracle; AWS or Azure
• Experience in setting up RDS service in cloud technologies such as AWS or Azure
• Strong proficiency with SQL and its variation among popular databases
• Should be well-versed in writing stored procedures, functions, and packages, and in using collections
• Skilled at optimizing large, complicated SQL statements.
• Should have worked in migration projects.
• Should have worked on creating reports.
• Should be able to distinguish between normalized and de-normalized data modelling designs and use cases.
• Knowledge of best practices when dealing with relational databases
• Capable of troubleshooting common database issues
• Familiar with tools that can aid with profiling server resource usage and optimizing it.
• Proficient understanding of code versioning tools such as Git and SVN
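The normalized-vs-denormalized distinction in the list above can be illustrated with a small sketch using Python's built-in sqlite3 module. The schema, table names, and data are entirely hypothetical; the point is only the trade-off: the normalized design stores each fact once and joins at query time, while the denormalized design duplicates customer attributes onto every order row to avoid the join:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    -- Normalized: customer attributes live in exactly one place.
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customers(id),
                         amount REAL);

    -- Denormalized: customer city is copied onto every order row.
    CREATE TABLE orders_wide (id INTEGER PRIMARY KEY, customer_name TEXT,
                              customer_city TEXT, amount REAL);
""")
con.execute("INSERT INTO customers VALUES (1, 'Asha', 'Hyderabad'), (2, 'Ravi', 'Pune')")
con.execute("INSERT INTO orders VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 80.0)")
con.execute("""INSERT INTO orders_wide VALUES
    (1, 'Asha', 'Hyderabad', 100.0),
    (2, 'Asha', 'Hyderabad', 250.0),
    (3, 'Ravi', 'Pune', 80.0)""")

# Normalized read: one join, but no duplicated data to keep in sync.
normalized = con.execute("""
    SELECT c.city, SUM(o.amount) FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.city ORDER BY c.city
""").fetchall()

# Denormalized read: simpler query, but a city change must touch many rows.
denormalized = con.execute("""
    SELECT customer_city, SUM(amount) FROM orders_wide
    GROUP BY customer_city ORDER BY customer_city
""").fetchall()

print(normalized)    # [('Hyderabad', 350.0), ('Pune', 80.0)]
print(denormalized)  # same result from the wide table
```

Roughly, the normalized form suits write-heavy OLTP workloads and the denormalized form suits read-heavy reporting, which is the use-case judgment the bullet above asks for.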
World’s first real-time opportunity engine
● Statistics - Always makes data-driven decisions using tools from statistics, such as: populations and
sampling, normal distribution and central limit theorem, mean, median, mode, variance, standard
deviation, covariance, correlation, p-value, expected value, conditional probability and Bayes's theorem
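Conditional probability and Bayes' theorem, mentioned in the list above, can be shown with a short worked example. The numbers below (test sensitivity, specificity, prevalence) are hypothetical, chosen only to make the arithmetic concrete:

```python
# Hypothetical scenario: a test with 99% sensitivity and 95% specificity
# for a condition with 1% prevalence. What is P(condition | positive test)?

def bayes_posterior(prior, sensitivity, specificity):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B),
    with P(B) expanded via the law of total probability."""
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

posterior = bayes_posterior(prior=0.01, sensitivity=0.99, specificity=0.95)
print(round(posterior, 3))  # → 0.167
```

Despite the accurate-looking test, the posterior is only about 17%, because false positives from the large healthy population swamp the true positives; this is exactly the kind of base-rate reasoning the bullet describes.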
● Machine Learning
○ Solid grasp of attention mechanism, transformers, convolutions, optimisers, loss functions,
LSTMs, forget gates, activation functions.
○ Can implement all of these from scratch in PyTorch, TensorFlow, or NumPy.
○ Comfortable defining own model architectures, custom layers and loss functions.
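The "implement from scratch" expectation above can be illustrated with a minimal scaled dot-product attention sketch in NumPy. Shapes and values are arbitrary; this covers the single-head case without masking or multi-head projections:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V, weights                     # weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))  # 3 query positions, d_k = 4
K = rng.standard_normal((5, 4))  # 5 key positions
V = rng.standard_normal((5, 2))  # values with d_v = 2
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape, w.shape)  # (3, 2) (3, 5)
```

Each row of `w` is a probability distribution over the 5 key positions, and the output is the corresponding convex combination of value vectors.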
● Modelling
○ Comfortable using all the major ML frameworks (PyTorch, TensorFlow, scikit-learn, etc.) and NLP models (not essential). Able to pick the right library and framework for the job.
○ Capable of turning research and papers into operational execution and functionality delivery.
Digital Center Technology Solution Company
In this role, we are looking for:
- A problem-solving mindset with the ability to understand business challenges and how to apply your analytics expertise to solve them.
- The unique person who can present complex mathematical solutions in a simple manner that most will understand, using data visualization techniques to tell a story with data.
- An individual excited by innovation and new technology, eager to find ways to employ these innovations in practice.
- A team mentality, empowered by the ability to work with a diverse set of individuals.
- A passion for data, with a particular emphasis on data visualization.
Basic Qualifications
- A Bachelor’s degree in Data Science, Math, Statistics, Computer Science or related field with an emphasis on data analytics.
- 5+ years of professional experience, preferably in a data analyst / data scientist or similar role, with proven results.
- 3+ years of professional experience in a leadership role guiding high-performing, data-focused teams, with a track record of building and developing talent.
- Proficiency in your statistics / analytics / visualization tool of choice, but preferably in the Microsoft Azure Suite, including PowerBI and/or AzureML.
JD for IoT Data Engineer:
The role requires experience in Azure core technologies – IoT Hub/ Event Hub, Stream Analytics, IoT Central, Azure Data Lake Storage, Azure Cosmos, Azure Data Factory, Azure SQL Database, Azure HDInsight / Databricks, SQL data warehouse.
You Have:
- Minimum 2 years of software development experience
- Minimum 2 years of experience in IoT/streaming data pipelines solution development
- Bachelor's and/or Master’s degree in computer science
- Strong Consulting skills in data management including data governance, data quality, security, data integration, processing, and provisioning
- Delivered data management projects with real-time/near real-time data insights delivery on Azure Cloud
- Translated complex analytical requirements into the technical design including data models, ETLs, and Dashboards / Reports
- Experience deploying dashboards and self-service analytics solutions on both relational and non-relational databases
- Experience with different computing paradigms in databases such as In-Memory, Distributed, Massively Parallel Processing
- Successfully delivered large-scale IoT data management initiatives covering the Plan, Design, Build, and Deploy phases, leveraging different delivery methodologies including Agile
- Experience in handling telemetry data with Spark Streaming, Kafka, Flink, Scala, Pyspark, Spark SQL.
- Hands-on experience with containers and Docker
- Exposure to streaming protocols like MQTT and AMQP
- Knowledge of OT network protocols like OPC UA, CAN Bus, and similar protocols
- Strong knowledge of continuous integration, static code analysis, and test-driven development
- Experience in delivering projects in a highly collaborative delivery model with teams at onsite and offshore
- Must have excellent analytical and problem-solving skills
- Delivered change management initiatives focused on driving data platforms adoption across the enterprise
- Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations
Roles & Responsibilities
You Will:
- Translate functional requirements into technical design
- Interact with clients and internal stakeholders to understand the data and platform requirements in detail and determine core Azure services needed to fulfill the technical design
- Design, Develop and Deliver data integration interfaces in ADF and Azure Databricks
- Design, Develop and Deliver data provisioning interfaces to fulfill consumption needs
- Deliver data models on the Azure platform, whether on Azure Cosmos DB, SQL DW / Synapse, or Azure SQL
- Advise clients on ML Engineering and deploying ML Ops at Scale on AKS
- Automate core activities to minimize the delivery lead times and improve the overall quality
- Optimize platform cost by selecting the right platform services and architecting the solution in a cost-effective manner
- Deploy Azure DevOps and CI/CD processes
- Deploy logging and monitoring across the different integration points for critical alerts
Global Internet of Things connected solutions provider
- Work individually or as part of a team on data science projects, and work closely with lines of business to understand business problems and translate them into identifiable machine learning problems that can be delivered as technical solutions.
- Build quick prototypes to check feasibility and value to the business.
- Design, train, and deploy neural networks for computer vision and machine learning-related problems.
- Perform various complex activities related to statistical/machine learning.
- Coordinate with business teams to provide analytical support for developing, evaluating, implementing, monitoring, and executing models.
- Collaborate with technology teams to deploy the models to production.
Key Criteria:
- 2+ years of experience in solving complex business problems using machine learning.
- Understanding and modeling experience in supervised, unsupervised, and deep learning models; hands-on knowledge of data wrangling, data cleaning/ preparation, dimensionality reduction is required.
- Experience in Computer Vision/Image Processing/Pattern Recognition, Machine Learning, Deep Learning, or Artificial Intelligence.
- Understanding of deep learning architectures like InceptionNet, VGGNet, FaceNet, YOLO, SSD, R-CNN, Mask R-CNN, and ResNet.
- Experience with one or more deep learning frameworks e.g., TensorFlow, PyTorch.
- Knowledge of vector algebra, statistical and probabilistic modeling is desirable.
- Strong programming skills in Python and C/C++, and proficiency with the Python data science stack (NumPy, SciPy, Pandas, scikit-learn, Jupyter, IPython).
- Experience working with Amazon SageMaker or Azure ML Studio for deployments is a plus.
- Experience with data visualization software such as Tableau or ELK is a plus.
- Strong analytical, critical thinking, and problem-solving skills.
- B.E./B.Tech. or M.E./M.Tech. in Computer Science, Applied Mathematics, Statistics, Data Science, or a related Engineering field.
- Minimum 60% in Graduation or Post-Graduation
- Great interpersonal and communication skills