1. Caffe
Caffe was created by Yangqing Jia during his PhD at the University of California, Berkeley. It is a deep learning framework built around an expressive architecture and extensible code. Its speed is what made it famous and popular among both researchers and enterprise users: according to its website, it can process more than 60 million images per day on a single NVIDIA K40 GPU. The project is managed by the Berkeley Vision and Learning Center (BVLC), and companies such as NVIDIA and Amazon have provided funding to support its development.
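As a rough sketch of how Caffe is typically driven from Python, the example below loads a pretrained network and runs a forward pass. The file names and the 'data' blob name are assumptions for illustration; they depend on the particular model definition being used.

```python
import numpy as np
import caffe  # pycaffe, Caffe's Python interface

caffe.set_mode_cpu()  # or caffe.set_mode_gpu() on a CUDA-capable machine

# Hypothetical model files: the network structure lives in a .prototxt,
# the trained weights in a .caffemodel.
net = caffe.Net('deploy.prototxt', 'weights.caffemodel', caffe.TEST)

# Fill the input blob (commonly named 'data') with a dummy batch
# and run a forward pass through the network.
net.blobs['data'].data[...] = np.random.rand(*net.blobs['data'].data.shape)
output = net.forward()
print({name: blob.shape for name, blob in output.items()})
```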
2. CNTK
CNTK stands for Computational Network Toolkit. It is Microsoft's open source artificial intelligence toolkit, and it performs well on a single CPU, a single GPU, multiple GPUs, or multiple machines each with multiple GPUs. Microsoft uses it mainly for speech recognition research, but it is also well suited to machine translation, image recognition, image captioning, text processing, language understanding, and language modeling.
3. Deeplearning4j
Deeplearning4j is an open source deep learning library for the Java Virtual Machine (JVM). It runs in distributed environments and integrates with Hadoop and Apache Spark, making it possible to configure deep neural networks from Java, Scala, and other JVM languages.
The project is backed by a commercial company called Skymind, which provides support, training, and an enterprise distribution.
4. DMTK
DMTK stands for Distributed Machine Learning Toolkit. Like CNTK, it is an open source artificial intelligence tool from Microsoft. Designed for big data applications, its goal is to train artificial intelligence systems faster. It consists of three main components: the DMTK framework, the LightLDA topic model algorithm, and the Distributed (Multi-sense) Word Embedding algorithm. To demonstrate its speed, Microsoft claims that on a cluster of eight machines it can "train a topic model with 1 million topics and a 10-million-word vocabulary (for a total of 10 trillion parameters) on a document collection with over 100 billion tokens," a feat unmatched by other tools.
5. H2O
Focused more on enterprise use of AI than on research, H2O has a large number of corporate customers, including Capital One, Cisco, Nielsen Catalina, PayPal, and Transamerica. It claims that anyone can use machine learning and predictive analytics to solve business problems, and it can be applied to predictive modeling, risk and fraud analysis, insurance analytics, advertising technology, healthcare, and customer intelligence.
It comes in two open source editions: standard H2O and Sparkling Water, which integrates with Apache Spark. Paid enterprise support is also available.
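As a minimal sketch of what predictive modeling looks like with H2O's Python client, the example below trains a gradient boosting model. The file name training_data.csv and the column names f1, f2, and label are placeholders, not part of any real dataset.

```python
import h2o
from h2o.estimators.gbm import H2OGradientBoostingEstimator

h2o.init()  # starts (or connects to) a local H2O cluster

# Hypothetical CSV with feature columns f1, f2 and a target column 'label'.
frame = h2o.import_file("training_data.csv")
frame["label"] = frame["label"].asfactor()  # treat the target as categorical

model = H2OGradientBoostingEstimator(ntrees=50, max_depth=5)
model.train(x=["f1", "f2"], y="label", training_frame=frame)

predictions = model.predict(frame)  # H2OFrame of class predictions
print(predictions.head())
```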
6. Mahout
Mahout is an Apache Foundation project and an open source machine learning framework. According to its website, it offers three main features: a programming environment for building scalable algorithms, pre-built algorithms for tools such as Spark and H2O, and a vector math experimentation environment called Samsara. Companies using Mahout include Adobe, Accenture, Foursquare, Intel, LinkedIn, Twitter, Yahoo, and many others. Its website lists third-party professional support.
7. MLlib
Thanks to its speed, Apache Spark has become one of the most popular tools for big data processing. MLlib is Spark's scalable machine learning library. It integrates with Hadoop and interoperates with both NumPy and R. It includes a wide range of machine learning algorithms and utilities, covering classification, regression, decision trees, recommendation, clustering, topic modeling, feature transformation, model evaluation, ML pipeline construction, ML persistence, survival analysis, frequent itemset and sequential pattern mining, distributed linear algebra, and statistics.
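For flavor, here is a minimal PySpark sketch using MLlib's DataFrame-based API to fit a logistic regression classifier; the tiny in-memory dataset is made up purely for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("mllib-sketch").getOrCreate()

# Toy labeled data: (label, feature vector) rows.
train = spark.createDataFrame(
    [(0.0, Vectors.dense(0.0, 1.1)),
     (0.0, Vectors.dense(0.5, 1.0)),
     (1.0, Vectors.dense(2.0, 0.2)),
     (1.0, Vectors.dense(2.5, 0.1))],
    ["label", "features"])

lr = LogisticRegression(maxIter=10, regParam=0.01)
model = lr.fit(train)  # fitting is distributed across the Spark cluster
model.transform(train).select("label", "prediction").show()

spark.stop()
```

The same script runs unchanged on a laptop or on a full cluster, which is the point of building the library on top of Spark.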
8. NuPIC
NuPIC, managed by Numenta, is an open source artificial intelligence project based on a theory called Hierarchical Temporal Memory (HTM). In essence, HTM is an attempt to build a computer system modeled on the human cerebral cortex. The goal is to create machines that "approach or exceed human-level performance on many cognitive tasks."
In addition to the open source license, Numenta also offers commercial licenses for NuPIC, as well as licenses for the patents on the underlying technology.
9. OpenNN
Designed for developers and researchers with an advanced understanding of artificial intelligence, OpenNN is a C++ library that implements neural network algorithms. Its key features include deep architectures and high performance. Extensive documentation is available on its website, including an introductory tutorial that explains the basics of neural networks. Paid support for OpenNN is available from Artelnics, a Spanish company specializing in predictive analytics.
10. OpenCyc
Developed by Cycorp, OpenCyc provides access to the Cyc knowledge base and its common sense reasoning engine. It includes more than 239,000 terms, about 2,093,000 triples, and about 69,000 owl:sameAs links to external semantic data namespaces. It is useful for rich domain modeling, semantic data integration, text understanding, domain-specific expert systems, and game AI. The company also offers two other versions of Cyc: one that is free for research use but not open source, and one that is available to businesses for a fee.
11. Oryx 2
Built on Apache Spark and Apache Kafka, Oryx 2 is an application development framework specialized for large-scale machine learning. It uses a lambda architecture with three tiers. Developers can use Oryx 2 to create new applications, and it also ships with some pre-built applications for common big data tasks such as collaborative filtering, classification, regression, and clustering. The big data tools vendor Cloudera created the original Oryx 1 project and continues to be actively involved in its development.
12. PredictionIO
In February 2016, Salesforce acquired PredictionIO, and in July it contributed the platform and its trademark to the Apache Software Foundation, which accepted it into the Apache Incubator. As Salesforce uses PredictionIO technology to enhance its own machine learning capabilities, improvements will flow back into the open source version as well. PredictionIO helps users build predictive engines with machine learning capabilities, which can be deployed as web services that answer dynamic queries in real time.
13. SystemML
Originally developed by IBM, SystemML is now an Apache big data project. It offers a highly scalable platform for high-level mathematics, with algorithms written in R or a Python-like syntax. Companies are already using it to track customer service for vehicle repairs, to direct airport traffic, and to link social media data with bank customers. It can run on top of Spark or Hadoop.
14. TensorFlow
TensorFlow is Google's open source artificial intelligence tool. It provides a library for numerical computation using dataflow graphs. It can run on a wide variety of systems with one or many CPUs and GPUs, and even on mobile devices. It offers deep flexibility, true portability, automatic differentiation, and support for Python and C++. Its website provides a detailed set of tutorials to help developers and researchers dive into using or extending its capabilities.
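To illustrate the dataflow-graph idea, here is a minimal computation written against the TensorFlow 1.x graph-and-session API, which was current at the time this list was compiled; the constants are arbitrary example values.

```python
import tensorflow as tf

# Build a small dataflow graph: two constant matrices and their product.
a = tf.constant([[1.0, 2.0]])      # shape (1, 2)
b = tf.constant([[3.0], [4.0]])    # shape (2, 1)
c = tf.matmul(a, b)                # a graph node; nothing is computed yet

# Launch the graph in a session and evaluate the node.
with tf.Session() as sess:
    print(sess.run(c))             # [[11.]]
```

Because the graph is described separately from its execution, the same graph can be placed on a CPU, a GPU, or a mobile runtime without changing the model code.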
15. Torch
Torch describes itself as "a scientific computing framework with wide support for machine learning algorithms that puts GPUs first," and it is characterized by flexibility and speed. Its packages make it easy to use for machine learning, computer vision, signal processing, parallel processing, image, video, and audio handling, and networking. It is built on a scripting language called LuaJIT, which is based on Lua.