Introduction

Multimodal Keras Wrapper

Wrapper for Keras that simplifies loading and handling multimodal data and models.

You can download and contribute to the code by cloning this repository.

Documentation

You can access the library documentation page at marcbs.github.io/multimodal_keras_wrapper/

Some code examples are available in demo.ipynb and test.py. Additionally, the Projects section below lists practical examples of projects built on this library.
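
As a quick orientation before opening the demo, the sketch below outlines how a dataset is typically assembled with the wrapper. The names used (Dataset, setInput, setOutput, saveDataset) and all file paths and ids are assumptions based on the usage pattern in demo.ipynb; consult the demo for the exact signatures.

```python
# Minimal sketch of building a dataset with the wrapper (assumed API, see demo.ipynb).
from keras_wrapper.dataset import Dataset, saveDataset

# Create a dataset container named 'my_dataset' stored under 'datasets/'.
ds = Dataset('my_dataset', 'datasets/my_dataset')

# Register a text input and a text output for the training split
# (file paths and ids are illustrative placeholders).
ds.setInput('data/train_source.txt', 'train', type='text', id='source_text')
ds.setOutput('data/train_target.txt', 'train', type='text', id='target_text')

# Persist the dataset object so it can be reloaded in later runs.
saveDataset(ds, 'datasets')
```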

Dependencies

The following dependencies are required to use this library:

Only when using NMS for certain localization utilities:

Installation

To install the library, follow these steps:

  1. Clone this repository:
     git clone https://github.com/MarcBS/multimodal_keras_wrapper.git
  2. Include the repository path in your PYTHONPATH (or see the sketch after these steps for a runtime alternative):
     export PYTHONPATH=$PYTHONPATH:/path/to/multimodal_keras_wrapper
  3. If you wish to install the dependencies (this will also install our custom Keras fork):
     pip install -r requirements.txt
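
If you prefer not to modify PYTHONPATH globally, a minimal sketch of a runtime alternative is shown below; the top-level package directory name keras_wrapper and the clone path are assumptions, so adjust them to match your checkout.

```python
# Runtime alternative to exporting PYTHONPATH (the clone path is a placeholder).
import sys
sys.path.append('/path/to/multimodal_keras_wrapper')

# If the path is correct, the top-level package should import cleanly
# (the package directory name keras_wrapper is an assumption).
import keras_wrapper
print(keras_wrapper.__file__)
```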

Projects

You can see more practical examples in the following projects, which use this library:

VIBIKNet for Visual Question Answering

ABiViRNet for Video Description

Sentence-SelectioNN for Domain Adaptation in SMT

Keras

For additional information on the Deep Learning library, visit the official web page www.keras.io or the GitHub repository https://github.com/fchollet/keras.

You can also use our custom Keras version, which provides several additional layers for Multimodal Learning.