[MRG] flake8 + pimp example figures #13
Conversation
Hello @agramfort,
Also, I know I use pl instead of plt, but it felt so much better at the time (still feels better, trust me). Still, I understand the need to make the examples more friendly to newcomers. @ncourty how do you feel about all that? |
Hello @agramfort and @rflamary,
|
hey,
so I made progress. I pushed full flake8 valid code.
I'll need you to refactor the tests so they are functions that start with
test_
then we can call SkipTest if cudamat is not installed (see the sketch below).
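A minimal sketch of the kind of test function I mean; the test name and the placeholder body are illustrative, not the actual POT test suite:

from unittest import SkipTest


def test_gpu_sinkhorn():  # hypothetical test name
    try:
        import cudamat  # noqa: F401
    except ImportError:
        raise SkipTest("cudamat is not installed, skipping the GPU test")
    # here the GPU solver would be run on a small random problem and its
    # result compared to the CPU implementation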
Some notes from my reading:
- why not use cg from scipy.optimize?
- sklearn calls make_ the functions that generate simulated data, so I propose to
rename get_ -> make_ for datasets (a small sketch follows this list)
- add references to papers in the examples (it's not always there)
- don't put params in fit in the estimators; put parameters in the init
- don't mix CamelCase and snake_case, e.g. OTDA_mapping_kernel ->
OTDAMappingKernel etc.
- why gpuarray and not cupy, which seems more mature? It should also allow
you to have the same code for CPU and GPU.
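A small sketch of the make_ naming convention with a hypothetical generator (name and signature are illustrative, not an existing function):

import numpy as np


def make_2D_samples_gauss(n, m, sigma, random_state=None):
    """Return n samples drawn from a 2D Gaussian with mean m and std sigma."""
    rng = np.random.RandomState(random_state)
    return np.asarray(m) + sigma * rng.randn(n, 2)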
feedback welcome.
A
|
Hello, thank you for the work, it looks great. I just have a few small remarks:
Regarding your remarks:
I'll work on the tests next week when I don't have NIPS reviews to write. |
full flake8 valid is awesome! :)
you did not change all pl to plt, probably because we did not converge with
@ncourty. So we need to do it quickly I suppose.
yes I used pl so you're still in your comfort zone :)
Regarding your remarks:
CG here stands for conditional gradient and not conjugate gradient as
implemented in scipy.
arfff good point. What it tells you is that cg is probably not explicit
enough. Maybe conditional_gradient is not that long to type :)
OK for the switch to make_ but all examples and notebooks have to be updated
also.
arfff. But I thought you were using sphinx-gallery. With sphinx-gallery you
should not ship notebooks. They are produced when you build the doc.
Examples do not have a proper description yet and I agree we need to
complete them with
I always thought this choice (params in init) in sklearn was weird since
it makes it more complicated to perform a warm start with a slightly different
regularization value (in a reg path for instance), which is common practice
in ML.
for this we use the warm_start parameter:
clf = Classifier(..., warm_start=True)
clf.fit(X, y)
clf.set_params(C=C_new)  # or just clf.C = C_new
clf.fit(X, y)  # refits starting from the previous solution
I'm sure you have good reasons for it though. My problem here is that it
breaks the code for people who already use the toolbox and I'd like to keep
the young and still small community. Maybe do both for a while with a
DeprecationWarning?
yes you need deprecation cycles. But the earlier you do it the better. If
it's too painful just deprecate a full class and create a new one with a
better name.
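For instance, a minimal sketch of such a cycle using the names from this thread (the reg parameter is just a placeholder):

import warnings


class OTDAMappingKernel(object):
    """New class with the CamelCase name."""

    def __init__(self, reg=1.0):
        self.reg = reg


class OTDA_mapping_kernel(OTDAMappingKernel):
    """Deprecated alias kept for backward compatibility."""

    def __init__(self, *args, **kwargs):
        warnings.warn("OTDA_mapping_kernel is deprecated, "
                      "use OTDAMappingKernel instead", DeprecationWarning)
        super(OTDA_mapping_kernel, self).__init__(*args, **kwargs)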
I agree we should stick to CamelCase or snake_case but I personally prefer snake_case
and most of the toolbox is in snake_case. Do you suggest the change for classes
only? Is it a common thing to do?
it's not a matter of preference but of convention :)
yes you should use CamelCase for classes and snake_case for functions and
methods.
I agree that the choice of cudamat is probably not the best. We were
planning with Nico to switch to more future-proof code with pycuda.
pycuda is old and not that maintained anymore. cupy is much more used these
days and more actively maintained.
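To illustrate the CPU/GPU point, a minimal sketch of array-agnostic code with cupy; the distance helper is a made-up example, not POT code:

import numpy as np

try:
    import cupy as cp
except ImportError:  # CPU-only environment
    cp = None


def squared_dist(x, y):
    """Pairwise squared Euclidean distances, on numpy or cupy arrays alike."""
    xp = cp.get_array_module(x) if cp is not None else np
    x2 = xp.sum(x ** 2, axis=1)[:, None]
    y2 = xp.sum(y ** 2, axis=1)[None, :]
    return x2 + y2 - 2 * x.dot(y.T)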
I'll work on the test next week when I don't have NIPS review to write.
:)
|
Hello Alexandre,
We met with Nico and decided on a few things, discussed in the following:
2017-07-13 14:44 GMT+02:00 Alexandre Gramfort <notifications@github.com>:
> you did not change all pl to plt, probably because we did not converge with
> @ncourty. So we need to do it quickly I suppose.
> yes I used pl so you're still in your comfort zone :)
OK so we decided to stick to pl since we import pylab and not pyplot and
all the modules in pot are two letters.
> Regarding your remarks:
>
> CG here stands for conditional gradient and not conjugate gradient as
> implemented in scipy.
> arfff good point. What it tells you is that cg is probably not explicit
> enough. Maybe conditional_gradient is not that long to type :)
OK for renaming the function and also for generalized cg. Temporarily we
will keep cg = conditional_gradient in order to keep backward compatibility and
will add a deprecation warning in the future (sketched below).
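A minimal sketch of such a backward-compatible alias; the signature shown is abbreviated and illustrative, not the exact current one:

import warnings


def conditional_gradient(a, b, M, reg, f, df, **kwargs):
    """Solve the regularized OT problem with conditional gradient (new name)."""
    raise NotImplementedError  # the existing solver body would go here


def cg(*args, **kwargs):
    """Deprecated alias of conditional_gradient, kept for compatibility."""
    warnings.warn("cg is deprecated, use conditional_gradient instead",
                  DeprecationWarning)
    return conditional_gradient(*args, **kwargs)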
> OK for the switch to make_ but all examples and notebooks have to be updated
> also.
> arfff. But I thought you were using sphinx-gallery. With sphinx-gallery you
> should not ship notebooks. They are produced when you build the doc.
OK but until recently I did not know how to do proper titles and text in
the Python code so that you have a notebook with cells instead of one giant
cell as in most sklearn examples. This will take some time but we will
switch to Python-only examples as soon as we transcribe the text and titles
from the notebooks.
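For reference, a minimal sketch of how a sphinx-gallery example script carries its own title and text cells; the content is illustrative:

# -*- coding: utf-8 -*-
"""
====================
Title of the example
====================

This introductory text is rendered above the first code cell.
"""

import numpy as np
import matplotlib.pylab as pl

##############################################################################
# A section title
# ---------------
# A comment block starting with a long line of # characters becomes a text
# cell in the generated notebook and HTML page.

x = np.arange(100)
pl.plot(x, x ** 2)
pl.show()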
Params in init:
OK for renaming the classes in CamelCase and moving the parameters to the
init. We will also add a set_params method in order to be more
compatible with sklearn (a sketch of the resulting API is below).
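Continuing the hypothetical OTDAMappingKernel from the earlier sketch, the sklearn-compatible shape could look roughly like this (parameter names are placeholders):

class OTDAMappingKernel(object):
    """Hyper-parameters live in __init__; fit only receives data."""

    def __init__(self, reg=1.0, eta=1e-3):
        self.reg = reg
        self.eta = eta

    def fit(self, Xs, Xt):
        # estimate the transport mapping from source Xs to target Xt
        # (solver omitted in this sketch)
        return self

    def set_params(self, **params):
        # sklearn-style parameter update, handy for warm starts
        for name, value in params.items():
            setattr(self, name, value)
        return self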
> I agree that the choice for cudamat is probably not the best. We were
> planning with Nico to switch to more future-proof code with pycuda.
> pycuda is old and not that maintained anymore. cupy is much more used these
> days and more actively maintained.
Yeah, OK for cupy, we will look into that.
…
> I'll work on the test next week when I don't have NIPS review to write.
> :)
|
ok fine with me. So how do you want to proceed with this PR?
I rebased and reverted the plt change
did you look at the diff?
|
Yeah, we are OK with the merge, but just wait for Travis before merging. It seems from the early results that some PEP8 errors remain. Thank you again |
all green! |
@rflamary @ncourty any opinion on this? shall I continue?
big questions are
don't merge yet. Just tell me what you think.