Commit 68878202 authored by ISO
Initial commit

absl-py==0.10.0
argon2-cffi==20.1.0
astor==0.8.1
astunparse==1.6.3
attrs==20.1.0
backcall==0.2.0
bleach==3.1.5
cachetools==4.1.1
certifi==2020.6.20
cffi==1.14.2
chardet==3.0.4
colorama==0.4.3
cycler==0.10.0
decorator==4.4.2
defusedxml==0.6.0
entrypoints==0.3
gast==0.3.3
google-auth==1.20.1
google-auth-oauthlib==0.4.1
google-pasta==0.2.0
grpcio==1.31.0
h5py==2.10.0
idna==2.10
importlib-metadata==1.7.0
ipykernel==5.3.4
ipython==7.16.1
ipython-genutils==0.2.0
ipywidgets==7.5.1
jedi==0.17.2
Jinja2==2.11.2
joblib==0.16.0
jsonschema==3.2.0
jupyter==1.0.0
jupyter-client==6.1.6
jupyter-console==6.1.0
jupyter-core==4.6.3
Keras==2.4.3
Keras-Applications==1.0.8
Keras-Preprocessing==1.1.2
kiwisolver==1.2.0
Markdown==3.2.2
MarkupSafe==1.1.1
matplotlib==3.3.1
mistune==0.8.4
nbconvert==5.6.1
nbformat==5.0.7
notebook==6.1.3
numpy==1.18.5
oauthlib==3.1.0
opencv-python==4.4.0.42
opt-einsum==3.3.0
packaging==20.4
pandas==1.1.1
pandocfilters==1.4.2
parso==0.7.1
pickleshare==0.7.5
Pillow==7.2.0
prometheus-client==0.8.0
prompt-toolkit==3.0.6
protobuf==3.13.0
pyasn1==0.4.8
pyasn1-modules==0.2.8
pycparser==2.20
Pygments==2.6.1
pyparsing==2.4.7
pyrsistent==0.16.0
python-dateutil==2.8.1
pytz==2020.1
pywin32==228
pywinpty==0.5.7
PyYAML==5.3.1
pyzmq==19.0.2
qtconsole==4.7.6
QtPy==1.9.0
requests==2.24.0
requests-oauthlib==1.3.0
rsa==4.6
scikit-learn==0.21.3
scipy==1.4.1
Send2Trash==1.5.0
six==1.15.0
spectral==0.22.1
tensorboard==2.3.0
tensorboard-plugin-wit==1.7.0
tensorflow==2.3.0
tensorflow-estimator==2.3.0
tensorflow-gpu==1.14.0
termcolor==1.1.0
terminado==0.8.3
testpath==0.4.4
threadpoolctl==2.1.0
tornado==6.0.4
traitlets==4.3.3
urllib3==1.25.10
wcwidth==0.2.5
webencodings==0.5.1
Werkzeug==1.0.1
widgetsnbextension==3.5.1
wrapt==1.12.1
zipp==3.1.0
metadata:
  name: tress_jup
attributes:
  editorFree: 'true'
components:
  - endpoints:
      - name: jupyter
        port: 3100
        attributes:
          type: ide
          discoverable: 'false'
          path: /
          protocol: http
          public: 'true'
    referenceContent: |
      kind: List
      items:
        - apiVersion: v1
          kind: Pod
          metadata:
            name: ws
            labels:
              name: ws
          spec:
            containers:
              - name: jupyter
                image: 'kosted/maap-esa-jupyterlab:0.0.9'
                imagePullPolicy: IfNotPresent
                mountSources: true
                resources:
                  limits:
                    memory: 50000Mi
                securityContext:
                  privileged: true
            tolerations:
              - key: "GPU"
                operator: "Equal"
                value: "true"
                effect: "NoSchedule"
            nodeSelector:
              workerType: gpu
    type: kubernetes
    volumes:
      - name: projects
        containerPath: /projects
    alias: maap-jupyterlab
    env:
      - value: WS_JUPYTER
        name: MACHINE_NAME
      - value: /projects
        name: JUPYTER_NOTEBOOK_DIR
      - value: 97262f0b-d3ca-4492-bcf8-9a0e12bdede8
        name: CLIENT_ID
      - value: VAL
        name: MAAP_ENV_TYPE
apiVersion: 1.0.0
# This repository shows how to use Hyper3DNet for tree classification on the MAAP with hyperspectral data
## Description
GPU use is enabled on the MAAP stack.
This code performs supervised training on hyperspectral data from the PRISMA mission using the Hyper3DNet model.
The model is implemented with the TensorFlow package and runs on a GPU on the MAAP.
## The workspace
The sample Jupyter MAAP-stack devfile in **dev_file_GPU.txt** enables GPU use.
This devfile contains a nodeSelector and a toleration so that the workspace can be scheduled on the GPU node.
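The relevant fragment of the pod spec inside **dev_file_GPU.txt** is the following (tolerations and nodeSelector are siblings of `containers` in the pod spec):

```yaml
tolerations:
  - key: "GPU"
    operator: "Equal"
    value: "true"
    effect: "NoSchedule"
nodeSelector:
  workerType: gpu
```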
## Datasets and packages
The first thing to do after creating the GPU workspace is to download the data stored in S3. To do this:
* Set the environment variables (in a terminal window):

```
echo $MAAP_ENV_TYPE
export MAAP_ENV_TYPE=VAL
export CLIENT_ID=97262f0b-d3ca-4492-bcf8-9a0e12bdede8
```

* Download the data using maap-s3.py.
The data are stored in **maap-scientific-data/shared/imane**.
First list the files stored at that location by running:

```
maap-s3.py list /maap-scientific-data/shared/imane/
```

Download the image file (.mat): `maap-s3.py download maap-scientific-data/shared/imane/*filename* *path_where_to_save_data*`
e.g.: `maap-s3.py download maap-scientific-data/shared/imane/france_region2_data.mat /projects/data/france_region2_data.mat`
Download the ground-truth file (.mat): `maap-s3.py download maap-scientific-data/shared/imane/*filename* *path_where_to_save_data*`
e.g.: `maap-s3.py download maap-scientific-data/shared/imane/france_region2_GT.mat /projects/data/france_region2_GT.mat`
 
* The second step is to install the required *packages* for the code in your working environment, whether that is the base env or a virtual env.
The required packages are listed in the **requirements** text file.
Please run:
**pip install -r requirements.txt**
In your env, also run: **conda install -c conda-forge psutil**
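After installing, you can sanity-check that the pinned packages resolve before opening the notebook. A small sketch using only the standard library; the names in `REQUIRED` are just a few examples taken from requirements.txt and can be extended:

```python
import importlib

# A few of the packages pinned in requirements.txt; extend as needed.
REQUIRED = ["numpy", "scipy", "h5py", "matplotlib"]

def missing_packages(names):
    """Return the subset of `names` that cannot be imported."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing
```

Running `missing_packages(REQUIRED)` in the workspace should return an empty list once the install succeeded.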
## Execution
Following the Jupyter notebook steps trains the Hyper3DNet model on the GPU.
Sample cell outputs are shown in the notebook **Classiifcation_using_hyper3Dnet.ipynb**.
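To illustrate the kind of preprocessing that supervised training on a hyperspectral cube typically needs, here is a hedged sketch of extracting labelled 3D patches from an image cube and its ground-truth map. The patch size and the convention that label 0 marks unlabelled pixels are assumptions for illustration, not taken from the notebook:

```python
import numpy as np

def extract_patches(cube, gt, patch_size=5):
    """Extract (patch_size x patch_size x bands) patches centred on each
    labelled pixel, returning the patches and their class labels."""
    margin = patch_size // 2
    # Zero-pad the spatial dims so border pixels get full-size patches.
    padded = np.pad(cube, ((margin, margin), (margin, margin), (0, 0)))
    patches, labels = [], []
    for r in range(gt.shape[0]):
        for c in range(gt.shape[1]):
            if gt[r, c] == 0:  # assumed: 0 marks unlabelled pixels
                continue
            patch = padded[r:r + patch_size, c:c + patch_size, :]
            patches.append(patch)
            labels.append(gt[r, c] - 1)  # shift class labels to start at 0
    return np.array(patches), np.array(labels)
```

The resulting arrays can then be fed to a 3D-CNN such as Hyper3DNet after a train/test split.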