Ebook: Data science and MLOps for data leaders
Join forces with other leaders to drive the three essential pillars of MLOps and trustworthy AI: trust in data, trust in models and trust in processes.
Read the ebook

Training: Level up your ML expertise
Learn fundamental concepts and build your skills with hands-on labs, courses, guided projects, trials and more.
Explore ML courses

Ebook: Unlock the power of generative AI + ML
Learn how to confidently incorporate generative AI and machine learning into your business.
Read the ebook

Podcast: Machine learning explained
Techsplainers by IBM breaks down the essentials of machine learning, from key concepts to real-world use cases. Clear, quick episodes help you learn the fundamentals fast.
Listen now

Guide: Put AI to work: Driving ROI with gen AI
Want a better return on your AI investments? Learn how scaling gen AI in key areas drives change by helping your best minds build and deliver innovative new solutions.
Read the guide

Ebook: How to choose the right foundation model
Learn how to select the most suitable AI foundation model for your use case.
Read the ebook

AI models: Explore IBM Granite
IBM® Granite® is our family of open, performant and trusted AI models, tailored for business and optimized to scale your AI applications. Explore language, code, time series and guardrail options.
Meet Granite

Guide: How to thrive in this new era of AI with trust and confidence
Dive into the three critical elements of a strong AI strategy: creating a competitive edge, scaling AI across the business and advancing trustworthy AI.
Read the guide