RF Jobs in Bangalore (Bengaluru)

11+ RF Jobs in Bangalore (Bengaluru) | RF Job openings in Bangalore (Bengaluru)

Apply to 11+ RF Jobs in Bangalore (Bengaluru) on CutShort.io. Explore the latest RF Job opportunities across top companies like Google, Amazon & Adobe.

Banyan Data Services

Posted by Sathish Kumar
Bengaluru (Bangalore)
3 - 15 yrs
₹6L - ₹20L / yr
Data Science
Data Scientist
MongoDB
Java
Big Data
+14 more

Senior Big Data Engineer 

Note: Notice period: 45 days

Banyan Data Services (BDS) is a data-focused company headquartered in San Jose, California, USA, specializing in comprehensive data solutions and services. 

 

We are looking for a Senior Hadoop Big Data Engineer with expertise in solving complex data problems across a big data platform. You will be part of our development team based out of Bangalore. This team focuses on the most innovative and emerging data infrastructure software and services to support highly scalable and available infrastructure. 

 

It's a once-in-a-lifetime opportunity to join our rocket ship startup run by a world-class executive team. We are looking for candidates who aspire to be a part of the cutting-edge solutions and services we offer to address next-gen data evolution challenges. 

 

 

Key Qualifications

 

·   5+ years of experience working with Java and Spring technologies

· At least 3 years of programming experience working with Spark on big data, including data profiling and building transformations

· Knowledge of microservices architecture is a plus

· Experience with any NoSQL databases such as HBase, MongoDB, or Cassandra

· Experience with Kafka or any streaming tools

· Knowledge of Scala would be preferable

· Experience with agile application development 

· Exposure to any cloud technologies, including containers and Kubernetes

· Demonstrated experience performing DevOps for platforms

· Strong skills in data structures and algorithms, with a focus on writing efficient, low-complexity code

· Exposure to Graph databases

· Passion for learning new technologies and the ability to do so quickly 

· A Bachelor's degree in a computer-related field or equivalent professional experience is required

 

Key Responsibilities

 

· Scope and deliver solutions, with the ability to design them independently from a high-level architecture

· Design and develop big data-focused microservices

· Work on big data infrastructure, distributed systems, data modeling, and query processing

· Build software with cutting-edge technologies on the cloud

· Willingness to learn new technologies and take on research-oriented projects 

· Proven interpersonal skills; contribute to the team effort by accomplishing related results as needed 
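
To illustrate the Spark-based data profiling and transformation work called for in the qualifications above, here is a minimal, hypothetical PySpark sketch; the paths, table, and column names are invented for the example and are not part of the role description:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical example: profile and transform a raw events table.
spark = SparkSession.builder.appName("events-profiling").getOrCreate()

# Load raw data (path is illustrative only).
events = spark.read.parquet("s3://example-bucket/raw/events/")

# Simple profiling: row count, null counts, and distinct values per key column.
profile = events.select(
    F.count("*").alias("rows"),
    F.sum(F.col("user_id").isNull().cast("int")).alias("null_user_ids"),
    F.countDistinct("event_type").alias("distinct_event_types"),
)
profile.show()

# Transformation: cleanse, derive a date column, and aggregate daily activity.
daily_activity = (
    events.dropna(subset=["user_id"])
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.countDistinct("user_id").alias("active_users"))
)
daily_activity.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_activity/")
```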

Bengaluru (Bangalore)
3 - 6 yrs
₹20L - ₹40L / yr
Data Science
Weka
Data Scientist
Statistical Modeling
Mathematics
+5 more
Roles and Responsibilities
● Research and develop advanced statistical and machine learning models for analysis of large-scale, high-dimensional data.
● Dig deeper into data, understand its characteristics, evaluate alternate models, and validate hypotheses through theoretical and empirical approaches.
● Productize proven or working models into production-quality code.
● Collaborate with product management, marketing, and engineering teams in Business Units to elicit and understand their requirements and challenges, and develop potential solutions.
● Stay current with the latest research and technology ideas; share knowledge by clearly articulating results and ideas to key decision makers.
● File patents for innovative solutions that add to the company's IP portfolio.
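
As a hedged illustration of what "productizing a proven model into production-quality code" can look like in Python, the sketch below trains, validates, and serializes a simple scikit-learn pipeline; the dataset, metric, and artifact name are purely illustrative and not part of the listing:

```python
# Hypothetical sketch of "productizing" a validated model: train, evaluate,
# and serialize it so an engineering team can serve it. Data and paths are
# illustrative only.
import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=5000, n_features=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"holdout AUC: {auc:.3f}")

# Persist the whole pipeline (scaler + model) as a single deployable artifact.
joblib.dump(model, "model_v1.joblib")
```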

Requirements
● 4 to 6 years of strong experience in data mining, machine learning, and statistical analysis.
● BS/MS/PhD in Computer Science, Statistics, Applied Math, or related areas from premier institutes (only IITs / IISc / BITS / top NITs or a top US university).
● Experience in productizing models to code in a fast-paced start-up environment.
● Expertise in the Python programming language and fluency in analytical tools such as Matlab, R, Weka, etc.
● Strong intuition for data and a keen aptitude for large-scale data analysis.
● Strong communication and collaboration skills.
Anicaa Data
Agency job
via wrackle by Naveen Taalanki
Bengaluru (Bangalore)
3 - 6 yrs
₹10L - ₹25L / yr
TensorFlow
PyTorch
Machine Learning (ML)
Data Science
Data Scientist
+11 more

Job Title – Data Scientist (Forecasting)

Anicca Data is seeking a Data Scientist (Forecasting) who is motivated to apply his/her/their skill set to solve complex and challenging problems. The focus of the role will center on applying deep learning models to real-world applications. The candidate should have experience in training and testing deep learning architectures. This candidate is expected to work on existing codebases or write an optimized codebase at Anicca Data. The ideal addition to our team is self-motivated, highly organized, and a team player who thrives in a fast-paced environment with the ability to learn quickly and work independently.

 

Job Location: Remote (for the time being) and Bangalore, India (post-COVID crisis)

 

Required Skills:

  • 3+ years of experience in a Data Scientist role
  • Bachelor's/Master's degree in Computer Science, Engineering, Statistics, Mathematics, or a similar quantitative discipline. A Ph.D. will add merit to the application process
  • Experience with large data sets, big data, and analytics
  • Exposure to statistical modeling, forecasting, and machine learning. Deep theoretical and practical knowledge of deep learning, machine learning, statistics, probability, and time series forecasting
  • Experience training machine learning (ML) algorithms in areas of forecasting and prediction
  • Experience in developing and deploying machine learning solutions in a cloud environment (AWS, Azure, Google Cloud) for production systems
  • Research and enhance existing in-house, open-source models, integrate innovative techniques, or create new algorithms to solve complex business problems
  • Experience in translating business needs into problem statements, prototypes, and minimum viable products
  • Experience managing complex projects including scoping, requirements gathering, resource estimations, sprint planning, and management of internal and external communication and resources
  • Ability to write C++ and Python code, along with TensorFlow and PyTorch, to build and enhance the platform used for training ML models
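
As an illustrative sketch of the PyTorch-based training code mentioned in the last point above, the example below fits a small feed-forward network to forecast the next value of a univariate series; the model, data, and hyperparameters are hypothetical and not Anicca Data's actual platform code:

```python
# Minimal, illustrative PyTorch sketch of training a feed-forward network to
# forecast the next value of a univariate series from a fixed-size window.
# All shapes and hyperparameters are hypothetical.
import torch
from torch import nn

WINDOW = 24  # number of past observations used as input

class WindowForecaster(nn.Module):
    def __init__(self, window: int = WINDOW, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Synthetic series for the example: a noisy sine wave.
series = torch.sin(torch.linspace(0, 60, 2000)) + 0.1 * torch.randn(2000)
windows = series.unfold(0, WINDOW + 1, 1)          # (N, WINDOW + 1)
x, y = windows[:, :WINDOW], windows[:, WINDOW:]    # inputs and next-step targets

model = WindowForecaster()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final training MSE: {loss.item():.4f}")
```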

Preferred Experience

  • Worked on forecasting projects – both classical and ML models
  • Experience training time series forecasting methods such as Moving Average (MA) and Autoregressive Integrated Moving Average (ARIMA), as well as Neural Network (NN) models such as feed-forward NN and nonlinear autoregressive networks
  • Strong background in forecasting accuracy drivers
  • Experience in Advanced Analytics techniques such as regression, classification, and clustering
  • Ability to explain complex topics in simple terms, walk through use cases, and tell stories
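
For the classical methods named above (Moving Average and ARIMA), a minimal hedged comparison sketch might look like the following; the data is synthetic and the ARIMA order is chosen only for illustration:

```python
# Hypothetical sketch comparing a simple moving-average baseline with an
# ARIMA model on a univariate series. The data is synthetic and the model
# order is illustrative.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
series = pd.Series(np.cumsum(rng.normal(size=300)))  # synthetic random walk
train, test = series[:-24], series[-24:]

# Baseline: forecast every future point with the mean of the last 12 observations.
ma_forecast = np.repeat(train.tail(12).mean(), len(test))

# ARIMA(1, 1, 1) fit on the training portion.
arima_forecast = ARIMA(train, order=(1, 1, 1)).fit().forecast(steps=len(test))

def mae(pred, actual):
    return float(np.mean(np.abs(np.asarray(pred) - np.asarray(actual))))

print("moving-average MAE:", mae(ma_forecast, test))
print("ARIMA MAE:", mae(arima_forecast, test))
```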
Expand My Business
Bengaluru (Bangalore), Mumbai, Chennai
7 - 15 yrs
₹10L - ₹28L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+2 more

Design, implement, and improve the analytics platform

Implement and simplify self-service data query and analysis capabilities of the BI platform

Develop and improve the current BI architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility

Deploy and use various big data technologies and run pilots to design low-latency data architectures at scale

Collaborate with business analysts, data scientists, product managers, software development engineers, and other BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, clustering, and machine learning algorithms


Educational

At Ganit we are building an elite team, ergo we are seeking candidates who possess the following backgrounds:

7+ years of relevant experience

Expert-level skills in writing and optimizing complex SQL

Knowledge of data warehousing concepts

Experience in data mining, profiling, and analysis

Experience with complex data modelling, ETL design, and using large databases in a business environment

Proficiency with the Linux command line and systems administration

Experience with languages like Python/Java/Scala

Experience with Big Data technologies such as Hive/Spark

Proven ability to develop unconventional solutions, see opportunities to innovate, and lead the way

Good experience working on cloud platforms like AWS, GCP & Azure, and on projects involving the creation of a data lake or data warehouse

Excellent verbal and written communication.

Proven interpersonal skills and the ability to convey key insights from complex analyses in summarized business terms. Ability to effectively communicate with multiple teams


Good to have

AWS/GCP/Azure Data Engineer Certification
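
To illustrate the kind of complex SQL over Hive/Spark tables and KPI work this role describes, here is a small hedged PySpark SQL sketch; the database, table, and column names are invented for the example:

```python
# Hypothetical PySpark SQL sketch: compute a daily conversion-rate KPI from a
# Hive-style orders table. Table and column names are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kpi-example").enableHiveSupport().getOrCreate()

daily_kpi = spark.sql("""
    SELECT
        order_date,
        COUNT(DISTINCT user_id)                                    AS visitors,
        COUNT(DISTINCT CASE WHEN status = 'PAID' THEN user_id END) AS buyers,
        ROUND(
            COUNT(DISTINCT CASE WHEN status = 'PAID' THEN user_id END)
            / COUNT(DISTINCT user_id), 4
        ) AS conversion_rate
    FROM analytics.orders
    WHERE order_date >= date_sub(current_date(), 30)
    GROUP BY order_date
""")

daily_kpi.write.mode("overwrite").saveAsTable("analytics.daily_conversion_kpi")
```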

Bengaluru (Bangalore)
2 - 4 yrs
₹12L - ₹16L / yr
Python
Bash
MySQL
Elasticsearch
Amazon Web Services (AWS)

What are we looking for:

 

  1. Strong experience in MySQL and writing advanced queries
  2. Strong experience in Bash and Python
  3. Familiarity with ElasticSearch, Redis, Java, NodeJS, ClickHouse, S3
  4. Exposure to cloud services such as AWS, Azure, or GCP
  5. 2+ years of experience in production support
  6. Strong experience with log management and performance monitoring stacks such as ELK, Prometheus + Grafana, and logging services on various cloud platforms
  7. Strong understanding of Linux OSes like Ubuntu and CentOS / Red Hat Linux
  8. Interest in learning new languages / frameworks as needed
  9. Good written and oral communication skills
  10. A growth mindset and a passion for building things from the ground up; most importantly, you should be fun to work with

 

As a product solutions engineer, you will:

 

  1. Analyze recorded runtime issues, diagnose them, and make occasional code fixes of low to medium complexity
  2. Work with developers to find and correct more complex issues
  3. Address urgent issues quickly; work within and measure against customer SLAs
  4. Use shell and Python scripting to actively automate manual / repetitive activities (a minimal sketch follows this list)
  5. Build anomaly detectors wherever applicable
  6. Pass articulated feedback from customers to the development and product teams
  7. Maintain an ongoing record of problem analysis and resolution in an on-call monitoring system
  8. Offer technical support needed in development
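
The following is the minimal sketch referenced in point 4 above: a hedged Python example of automating a repetitive log check with a naive anomaly rule. The log path, line format, and 3-sigma threshold are assumptions for illustration only:

```python
# Hypothetical sketch: scan an application log for error spikes and flag
# anomalous minutes. The log path, format, and threshold are invented.
import re
from collections import Counter
from statistics import mean, stdev

LOG_PATH = "/var/log/example-app/app.log"
LINE_RE = re.compile(r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}):\d{2}.*\bERROR\b")

errors_per_minute = Counter()
with open(LOG_PATH, encoding="utf-8", errors="ignore") as fh:
    for line in fh:
        match = LINE_RE.match(line)
        if match:
            errors_per_minute[match.group("ts")] += 1

counts = list(errors_per_minute.values())
if len(counts) >= 2:
    threshold = mean(counts) + 3 * stdev(counts)  # naive 3-sigma rule
    for minute, count in sorted(errors_per_minute.items()):
        if count > threshold:
            print(f"ANOMALY: {count} errors at {minute} (threshold {threshold:.1f})")
```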

 

Accolite Software

Posted by Nikita Sadarangani
Remote, Bengaluru (Bangalore)
3 - 10 yrs
₹5L - ₹24L / yr
Data Science
R Programming
Python
Deep Learning
Neural networks
+3 more
  • Adept at machine learning techniques and algorithms
  • Feature selection, dimensionality reduction, and building and optimizing classifiers using machine learning techniques
  • Data mining using state-of-the-art methods
  • Doing ad-hoc analysis and presenting results
  • Proficiency in using query languages such as N1QL, SQL
  • Experience with data visualization tools such as D3.js, GGplot, Plotly, PyPlot, etc.
  • Creating automated anomaly detection systems and constant tracking of their performance
  • Strong Python skills are a must
  • Strong data analysis and mining skills are a must
  • Deep Learning, Neural Networks, CNN, Image Processing (must)
  • Building analytic systems: data collection, cleansing, and integration
  • Experience with NoSQL databases such as Couchbase, MongoDB, Cassandra, HBase
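
As a hedged illustration of the feature selection, dimensionality reduction, and classifier optimization listed above, the sketch below uses a scikit-learn pipeline on synthetic data; the feature counts and parameter grid are arbitrary choices for the example:

```python
# Hypothetical sketch: feature selection + dimensionality reduction + a
# classifier, tuned with a small grid search. The dataset is synthetic.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = make_classification(n_samples=2000, n_features=100, n_informative=15, random_state=0)

pipeline = Pipeline([
    ("select", SelectKBest(f_classif, k=40)),   # keep the 40 most informative features
    ("reduce", PCA(n_components=10)),           # project them to 10 components
    ("clf", RandomForestClassifier(random_state=0)),
])

# Small grid search to "optimize" the classifier; the grid is illustrative.
search = GridSearchCV(
    pipeline,
    param_grid={"clf__n_estimators": [100, 300], "clf__max_depth": [None, 10]},
    cv=3,
    n_jobs=-1,
)
search.fit(X, y)
print("best CV accuracy:", round(search.best_score_, 3), "params:", search.best_params_)
```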

Bengaluru (Bangalore)
3 - 5 yrs
₹12L - ₹14L / yr
Data Engineer
Big Data
Python
Amazon Web Services (AWS)
SQL
+2 more
  • We are looking for a Data Engineer with 3-5 years of experience in Python, SQL, AWS (EC2, S3, Elastic Beanstalk, API Gateway), and Java.
  • The applicant must be able to perform Data Mapping (data type conversion, schema harmonization) using Python, SQL, and Java.
  • The applicant must be familiar with and have programmed ETL interfaces (OAUTH, REST API, ODBC) using the same languages.
  • The company is looking for someone who shows an eagerness to learn and who asks concise questions when communicating with teammates.
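
To illustrate the data mapping (data type conversion and schema harmonization) this role calls for, here is a small hedged pandas sketch; the source schemas, column mappings, and sample rows are invented for the example:

```python
# Hypothetical sketch: harmonize two source schemas (different column names
# and types) into one target schema using pandas. All schemas and values are
# invented for the example.
import pandas as pd

TARGET_COLUMNS = {"customer_id": "int64", "signup_date": "datetime64[ns]", "revenue": "float64"}

SOURCE_A_MAPPING = {"cust_id": "customer_id", "created": "signup_date", "rev_usd": "revenue"}
SOURCE_B_MAPPING = {"CustomerID": "customer_id", "SignupDt": "signup_date", "Revenue": "revenue"}

def harmonize(df: pd.DataFrame, mapping: dict) -> pd.DataFrame:
    out = df.rename(columns=mapping)[list(TARGET_COLUMNS)]
    out["customer_id"] = out["customer_id"].astype("int64")
    out["signup_date"] = pd.to_datetime(out["signup_date"])
    out["revenue"] = pd.to_numeric(out["revenue"]).astype("float64")
    return out

source_a = pd.DataFrame({"cust_id": ["1", "2"], "created": ["2024-01-05", "2024-02-10"], "rev_usd": ["10.5", "7"]})
source_b = pd.DataFrame({"CustomerID": [3], "SignupDt": ["2024-03-01"], "Revenue": [12.0]})

unified = pd.concat(
    [harmonize(source_a, SOURCE_A_MAPPING), harmonize(source_b, SOURCE_B_MAPPING)],
    ignore_index=True,
)
print(unified.dtypes)
```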
Freelancer

Posted by Nirmala Hk
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹35L / yr
Python
Shell Scripting
MySQL
SQL
Amazon Web Services (AWS)
+3 more

  • 3+ years of experience in deployment, monitoring, tuning, and administration of high-concurrency MySQL production databases.

  • Solid understanding of writing optimized SQL queries on MySQL databases
  • Understanding of AWS, VPC, networking, security groups, IAM, and roles.
  • Expertise in scripting in Python or Shell/PowerShell
  • Must have experience in large-scale data migrations
  • Excellent communication skills.
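
As a hedged sketch of the scripted, large-scale MySQL data migration work described above, the example below copies rows in batches using PyMySQL; the hosts, credentials, table names, and batch size are placeholders, not details from the listing:

```python
# Hypothetical sketch of a batched MySQL data migration: copy rows from a
# legacy table to a new one in chunks to keep transactions short. Connection
# details, table names, and batch size are placeholders.
import pymysql

BATCH_SIZE = 5000

source = pymysql.connect(host="legacy-db.example.internal", user="migrator",
                         password="***", database="app")
target = pymysql.connect(host="new-db.example.internal", user="migrator",
                         password="***", database="app")

last_id = 0
with source.cursor() as read_cur, target.cursor() as write_cur:
    while True:
        read_cur.execute(
            "SELECT id, email, created_at FROM users_legacy WHERE id > %s ORDER BY id LIMIT %s",
            (last_id, BATCH_SIZE),
        )
        rows = read_cur.fetchall()
        if not rows:
            break
        write_cur.executemany(
            "INSERT INTO users (id, email, created_at) VALUES (%s, %s, %s)",
            rows,
        )
        target.commit()            # commit per batch to keep transactions short
        last_id = rows[-1][0]      # resume after the highest migrated id
        print(f"migrated up to id {last_id}")
```

Batching by primary key keeps locks short and makes the migration resumable if it is interrupted.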
Dunzo
Agency job
via zyoin by Pratibha Yadav
Bengaluru (Bangalore)
8 - 12 yrs
₹50L - ₹90L / yr
Data Science
Machine Learning (ML)
NumPy
R Programming
Python
  • B.Tech/M.Tech from a tier-1 institution
  • 8+ years of experience in machine learning techniques like logistic regression, random forest, boosting, trees, neural networks, etc.
  • Demonstrated experience with Python and SQL, and proficiency in scikit-learn, Pandas, NumPy, Keras, and TensorFlow/PyTorch
  • Experience working with Qlik Sense or Tableau is a plus
  • Experience working in a product company is a plus
Lymbyc

Posted by Venky Thiriveedhi
Bengaluru (Bangalore), Chennai
4 - 8 yrs
₹9L - ₹14L / yr
Apache Spark
Apache Kafka
Druid Database
Big Data
Apache Sqoop
+5 more
Key skill set: Apache NiFi, Kafka Connect (Confluent), Sqoop, Kylo, Spark, Druid, Presto, RESTful services, Lambda / Kappa architectures

Responsibilities:
- Build a scalable, reliable, operable, and performant big data platform for both streaming and batch analytics
- Design and implement data aggregation, cleansing, and transformation layers

Skills:
- 4+ years of hands-on experience designing and operating large data platforms
- Experience in big data ingestion, transformation, and stream/batch processing technologies using Apache NiFi, Apache Kafka, Kafka Connect (Confluent), Sqoop, Spark, Storm, Hive, etc.
- Experience in designing and building streaming data platforms in Lambda and Kappa architectures
- Working experience with at least one NoSQL or OLAP data store like Druid, Cassandra, Elasticsearch, Pinot, etc.
- Experience with at least one data warehousing tool like RedShift, BigQuery, Azure SQL Data Warehouse
- Exposure to other data ingestion, data lake, and querying frameworks like Marmaray, Kylo, Drill, Presto
- Experience in designing and consuming microservices
- Exposure to security and governance tools like Apache Ranger, Apache Atlas
- Any contributions to open source projects are a plus
- Experience in performance benchmarks will be a plus
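
To illustrate one small piece of the streaming ingestion described above, here is a hedged Python sketch that consumes JSON events from Kafka, cleanses them, and appends them to a landing file; the broker, topic, and field names are placeholders, and a production platform would typically use NiFi, Kafka Connect, or Spark for this layer:

```python
# Hypothetical sketch of a tiny streaming-ingestion step: consume JSON events
# from Kafka, apply a cleansing transformation, and append them to a landing
# file. Broker, topic, and field names are placeholders.
import json
from typing import Optional

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "clickstream-raw",
    bootstrap_servers="broker.example.internal:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
    group_id="landing-writer",
)

def cleanse(event: dict) -> Optional[dict]:
    """Drop malformed events and normalize field names."""
    if "user_id" not in event or "ts" not in event:
        return None
    return {"user_id": str(event["user_id"]), "ts": event["ts"],
            "page": event.get("page", "unknown")}

with open("landing/clickstream.jsonl", "a", encoding="utf-8") as sink:
    for message in consumer:
        record = cleanse(message.value)
        if record is not None:
            sink.write(json.dumps(record) + "\n")
```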
Computer Power Group Pvt Ltd
Bengaluru (Bangalore), Chennai, Pune, Mumbai
7 - 13 yrs
₹14L - ₹20L / yr
R Programming
Python
Data Science
SQL server
Business Analysis
+3 more
Requirement Specifications:

Job Title: Data Scientist
Experience: 7 to 10 Years
Work Location: Mumbai, Bengaluru, Chennai
Job Role: Permanent
Notice Period: Immediate to 60 days

Job description:
• Support delivery of one or more data science use cases, leading on data discovery and model building activities
• Conceptualize and quickly build POCs on new product ideas; should be willing to work as an individual contributor
• Open to learning and implementing newer tools/products
• Experiment with and identify the best methods, techniques, and algorithms for analytical problems
• Operationalize – work closely with the engineering, infrastructure, service management, and business teams to operationalize use cases

Essential Skills
• Minimum 2-7 years of hands-on experience with statistical software tools: SQL, R, Python
• 3+ years' experience in business analytics, forecasting, or business planning with an emphasis on analytical modeling, quantitative reasoning, and metrics reporting
• Experience working with large data sets in order to extract business insights or build predictive models
• Proficiency in one or more statistical tools/languages – Python, Scala, R, SPSS, or SAS, and related packages like Pandas, SciPy/Scikit-learn, NumPy, etc.
• Good data intuition / analysis skills; SQL and PL/SQL knowledge is a must
• Ability to manage and transform a variety of datasets to cleanse, join, and aggregate them
• Hands-on experience running various methods like regression, random forest, k-NN, k-Means, boosted trees, SVM, neural networks, text mining, NLP, statistical modelling, data mining, exploratory data analysis, and statistics (hypothesis testing, descriptive statistics)
• Deep domain knowledge (BFSI, Manufacturing, Auto, Airlines, Supply Chain, Retail & CPG)
• Demonstrated ability to work under time constraints while delivering incremental value
• Education: minimum a Masters in Statistics, or a PhD in domains linked to applied statistics, applied physics, Artificial Intelligence, Computer Vision, etc.; BE/BTECH/BSc Statistics/BSc Maths
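
As a hedged illustration of the exploratory data analysis and hypothesis testing named above, the sketch below compares two synthetic customer segments with descriptive statistics and a Welch t-test; all data, segment names, and numbers are invented for the example:

```python
# Hypothetical sketch: descriptive statistics plus a two-sample t-test to
# compare order values between two customer segments. The data is synthetic.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "segment": np.repeat(["A", "B"], 500),
    "order_value": np.concatenate([
        rng.normal(520, 80, 500),   # segment A
        rng.normal(545, 80, 500),   # segment B
    ]),
})

# Descriptive statistics per segment.
print(df.groupby("segment")["order_value"].describe()[["mean", "std", "count"]])

# Welch t-test: is the difference in mean order value statistically significant?
a = df.loc[df.segment == "A", "order_value"]
b = df.loc[df.segment == "B", "order_value"]
t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```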
Why apply via Cutshort?
Connect with actual hiring teams and get their fast response. No spam.