Dave Bergmann

Ebook
Data science and MLOps for data leaders
Align with other leaders on the 3 key goals of MLOps and trustworthy AI: trust in data, trust in models and trust in processes.
Read the ebook

Techsplainers Podcast
MLOps explained
Techsplainers by IBM breaks down the essentials of MLOps, from key concepts to real‑world use cases. Clear, quick episodes help you learn the fundamentals fast.
Listen now

AI models
Explore IBM Granite
IBM® Granite® is our family of open, performant and trusted AI models, tailored for business and optimized to scale your AI applications. Explore language, code, time series and guardrail options.
Meet Granite

Ebook
Unlock the power of generative AI and ML
Learn how to incorporate generative AI, machine learning and foundation models into your business operations for improved performance.
Read the ebook

Ebook
How to choose the right foundation model
Learn how to select the most suitable AI foundation model for your use case.
Read the ebook

Explainer
What is machine learning?
Machine learning is a branch of AI and computer science that focuses on using data and algorithms to enable AI to imitate the way that humans learn.
Read the article

Guide
How to thrive in this new era of AI with trust and confidence
Dive into the 3 critical elements of a strong AI strategy: creating a competitive edge, scaling AI across the business and advancing trustworthy AI.
Read the guide