
[MRG] Laplace regularized OTDA #140


Merged: 35 commits merged into PythonOT:master on Apr 20, 2020

Conversation

@ievred (Contributor) commented on Apr 3, 2020

  • added emd with Laplacian regularization (emd_laplace) and a corresponding test to the test suite (a usage sketch follows below)
  • added the Laplacian matrix computation to utils
  • added an example
  • updated the README for the new POT URL
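
A minimal usage sketch of the transport class added here; the class name EMDLaplace and its parameter names are taken from this thread, so the merged API may differ:

import numpy as np
import ot

rng = np.random.RandomState(0)
Xs = rng.randn(50, 2)                         # source samples
Xt = rng.randn(60, 2) + np.array([3.0, 0.0])  # shifted target samples

# EMDLaplace and its defaults follow the __init__ discussed below; names may differ after merge
mapper = ot.da.EMDLaplace(reg_type='pos', reg_lap=1.0, reg_src=1.0, alpha=0.5)
mapper.fit(Xs=Xs, Xt=Xt)             # estimate the Laplacian-regularized coupling
Xs_mapped = mapper.transform(Xs=Xs)  # barycentric mapping of the source samples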

@rflamary changed the title from "Laplace da" to "[WIP] Laplace da" on Apr 7, 2020
@rflamary changed the title from "[WIP] Laplace da" to "[WIP] Laplace regularized OTDA" on Apr 7, 2020
# -*- coding: utf-8 -*-
"""
========================
OT for domain adaptation
Collaborator (rflamary): add a proper title, since it will appear in the documentation.

Author (ievred): done

References
----------

.. [5] N. Courty; R. Flamary; D. Tuia; A. Rakotomamonjy,
Collaborator (rflamary): please add the paper that provides the expression and gradients here and in the README.

You should also update the README line:
"Optimal transport for domain adaptation with group lasso regularization [5]"
to add Laplacian regularization, and add yourself as a contributor ;)

Author (ievred): added Laplacian regularization to the README, and added the workshop citation to da.py (in both emd_laplace and EMDLaplace) and to the README.

Author (ievred): and added myself to the contributors.

Collaborator (@rflamary) left a comment:

Nice PR,
I would add the option to select between position and displacement regularization.

ot/da.py Outdated
        + (1 - alpha) * np.trace(np.dot(xs.T, np.dot(G, np.dot(lT, np.dot(G.T, xs)))))

def df(G):
    return alpha * np.dot(lS + lS.T, np.dot(G, np.dot(xt, xt.T)))\
Collaborator (rflamary): pre-compute (lS + lS.T), np.dot(xt, xt.T) and (lT + lT.T).

Author (ievred): done
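
A minimal sketch of the suggested pre-computation; lS, lT, xs, xt and alpha are assumed to be the variables defined in the surrounding emd_laplace code, and this is not the PR's final code:

import numpy as np

# symmetrized Laplacians and Gram matrices, built once outside f/df so they are
# not recomputed at every iteration of the conditional-gradient solver
lS_sym = lS + lS.T
lT_sym = lT + lT.T
xt_gram = np.dot(xt, xt.T)   # used in the source-graph gradient term
xs_gram = np.dot(xs, xs.T)   # used in the target-graph gradient term

def f(G):
    # Laplacian regularization value: source-graph term + target-graph term
    return alpha * np.trace(np.dot(xt.T, np.dot(G.T, np.dot(lS, np.dot(G, xt))))) \
        + (1 - alpha) * np.trace(np.dot(xs.T, np.dot(G, np.dot(lT, np.dot(G.T, xs)))))

def df(G):
    # gradient of f with respect to the coupling G
    return alpha * np.dot(lS_sym, np.dot(G, xt_gram)) \
        + (1 - alpha) * np.dot(np.dot(xs_gram, G), lT_sym)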

lS = laplacian(sS)
lT = laplacian(sT)

def f(G):
Collaborator (rflamary): could you please implement both position regularization and displacement regularization? It only changes the M matrix and makes the function more general.

See equation (11) vs (9):
https://remi.flamary.com/biblio/flamary2014optlaplace.pdf

Author (ievred): done
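
A rough sketch of why the displacement variant only changes M: expanding the displacement term of equation (11) yields the position term of equation (9) plus a term that is linear in the coupling G, and that linear term can be folded into the cost matrix before calling the solver. The scalings below are assumptions, not the PR's exact code:

import numpy as np

def shift_cost_for_displacement(M, xs, xt, lS, lT, alpha, reg_lap):
    # Fold the extra linear term of displacement regularization into the cost M.
    # The 1/n factors are assumptions; they depend on how the Laplacian terms
    # are normalized in the PR.
    Cs = -reg_lap * alpha / xs.shape[0] * np.dot(lS + lS.T, np.dot(xs, xt.T))
    Ct = -reg_lap * (1 - alpha) / xt.shape[0] * np.dot(np.dot(xs, xt.T), lT + lT.T)
    return M + Cs + Ct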

Collaborator (@rflamary) left a comment:

Sorry, one last review: at the moment you cannot pass the similarity parameters ('nn' or rbfparam) to the class, which clearly limits its applicability.

Please add this possibility (for instance by storing kwargs).

ot/da.py Outdated
in NIPS Workshop on Optimal Transport and Machine Learning OTML, 2014.
"""

def __init__(self, reg_type='pos', reg_lap=1., reg_src=1., alpha=0.5,
Collaborator (rflamary): how can you pass nn or sim to the emd_laplace function? This is necessary; you should use kwargs and store them for the fit.

Author (ievred): added a kwargs dict in fit that is passed on to emd_laplace.
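
A minimal sketch of this kwargs plumbing; the class below is illustrative and the emd_laplace signature is an assumption based on this thread, not the merged API:

import ot
from ot.da import emd_laplace  # solver added by this PR; exact signature may differ

class EMDLaplaceSketch:
    def __init__(self, sim='knn', **kwargs):
        self.sim = sim
        self.kwargs = kwargs   # e.g. nn=5 or rbfparam=0.1, stored for fit

    def fit(self, Xs=None, Xt=None):
        a, b = ot.unif(Xs.shape[0]), ot.unif(Xt.shape[0])
        M = ot.dist(Xs, Xt)
        # forward the stored similarity parameters to the solver
        # (the regularization parameters of the real class are omitted for brevity)
        self.coupling_ = emd_laplace(a, b, Xs, Xt, M, sim=self.sim, **self.kwargs)
        return self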

xs : np.ndarray (ns,d)
samples in the source domain
xt : np.ndarray (nt,d)
samples in the target domain
Collaborator (rflamary): sim is not documented. You should add it to the parameter list and also define the parameters for each sim: the number of neighbours for 'nn' and the Gaussian kernel width.

Author (ievred): documented sim and added a common parameter 'similarity_param'.
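
A possible docstring entry for these parameters; the names follow this thread ('nn', 'gauss', 'similarity_param') and may differ in the merged code:

sim : string, optional
    Similarity used to build the graph Laplacians: a nearest-neighbour graph
    ('nn') or a Gaussian kernel ('gauss').
similarity_param : int or float, optional
    Parameter of the chosen similarity: the number of neighbours for 'nn',
    the kernel width (rbfparam) for 'gauss'. If not provided, a default is
    estimated from the data.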

ot/da.py Outdated
if sim == 'gauss':
    if 'rbfparam' not in kwargs:
        kwargs['rbfparam'] = 1 / (2 * (np.mean(dist(xs, xs, 'sqeuclidean')) ** 2))
    sS = kernel(xs, xs, method=kwargs['sim'], sigma=kwargs['rbfparam'])
Collaborator (rflamary): do not use kwargs['sim'] here, use the literal 'gauss'.

Author (ievred): I left it in case another kernel is one day added to kernel, but I can fix it.
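
A minimal sketch of the fix the reviewer asks for, i.e. the same branch with the literal kernel name:

if sim == 'gauss':
    if 'rbfparam' not in kwargs:
        # default RBF width derived from the mean pairwise squared distance
        kwargs['rbfparam'] = 1 / (2 * (np.mean(dist(xs, xs, 'sqeuclidean')) ** 2))
    # the kernel type is already known in this branch, so pass 'gauss' directly
    sS = kernel(xs, xs, method='gauss', sigma=kwargs['rbfparam'])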

@rflamary changed the title from "[WIP] Laplace regularized OTDA" to "[MRG] Laplace regularized OTDA" on Apr 20, 2020
@rflamary merged commit 43b2190 into PythonOT:master on Apr 20, 2020
@rflamary mentioned this pull request on Apr 23, 2020