RubberGoose246's Repositories
RubberGoose246/agent-book
RubberGoose246/aistudio-python-quickstart-sample
Quickstart Python sample for getting started with Azure AI Studio using the SDK or CLI
RubberGoose246/azure-docs
Open source documentation of Microsoft Azure
RubberGoose246/bedrock-book
Sample code for the book 「Amazon Bedrock 生成AIアプリ開発入門」 (Introduction to Generative AI Application Development with Amazon Bedrock)
RubberGoose246/cognitive-services-speech-sdk
Sample code for the Microsoft Cognitive Services Speech SDK
RubberGoose246/deep-learning-from-scratch
『ゼロから作る Deep Learning』 (Deep Learning from Scratch; O'Reilly Japan, 2016)
RubberGoose246/deep-learning-from-scratch-2
『ゼロから作る Deep Learning ❷』 (Deep Learning from Scratch 2; O'Reilly Japan, 2018)
RubberGoose246/deep-learning-from-scratch-4
『ゼロから作る Deep Learning ❹』 (Deep Learning from Scratch 4; O'Reilly Japan, 2022)
RubberGoose246/deep-learning-from-scratch-5
『ゼロから作る Deep Learning ❺』 (Deep Learning from Scratch 5; O'Reilly Japan, 2024)
RubberGoose246/DeepSpeedFugaku
RubberGoose246/election2024
Repository for the 2024 Tokyo gubernatorial election (東京都知事選2024)
RubberGoose246/function-calling-using-amazon-bedrock-anthropic-claude-3
Function calling using Amazon Bedrock with Anthropic Claude 3 foundation model
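The function-calling flow that repository covers can be sketched with boto3's Converse API against a Claude 3 model on Bedrock. The tool name, its JSON schema, and the prompt below are illustrative assumptions, not taken from the repository, and the call itself requires AWS credentials and Bedrock model access.

```python
# Sketch of Bedrock function calling with the Converse API.
# The get_weather tool is a hypothetical example, not from the repo.
tool_config = {
    "tools": [
        {
            "toolSpec": {
                "name": "get_weather",
                "description": "Return the current weather for a city.",
                "inputSchema": {
                    "json": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    }
                },
            }
        }
    ]
}

messages = [
    {"role": "user", "content": [{"text": "What's the weather in Tokyo?"}]}
]

def call_claude(client, model_id="anthropic.claude-3-sonnet-20240229-v1:0"):
    """Send the request; client = boto3.client('bedrock-runtime')."""
    response = client.converse(
        modelId=model_id,
        messages=messages,
        toolConfig=tool_config,
    )
    # When the model decides to invoke the tool, stopReason is "tool_use"
    # and the message contains a toolUse block with generated arguments,
    # which the caller executes and returns as a toolResult message.
    return response["output"]["message"]
```

In a full loop, the caller runs the requested tool and sends the result back in a follow-up `converse` call so the model can produce its final answer.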
RubberGoose246/langchain-book
Source code written for the book 「LangChain完全入門」 (Complete Introduction to LangChain)
RubberGoose246/llm-book
GitHub repository for 「大規模言語モデル入門」 (Introduction to Large Language Models, 2023) and 「大規模言語モデル入門Ⅱ〜生成型LLMの実装と評価」 (Introduction to Large Language Models II: Implementing and Evaluating Generative LLMs, 2024)
RubberGoose246/ML-For-Beginners
RubberGoose246/notebooks
Jupyter notebooks for the Natural Language Processing with Transformers book
RubberGoose246/swan
This project aims to enable language model inference on FPGAs, supporting AI applications on edge devices and in resource-constrained environments.