
multiprocess functionality does not play way with generators #125

Closed
jpellerin opened this issue Dec 14, 2011 · 99 comments

@jpellerin (Member)

What steps will reproduce the problem?

1. Create a simple test generator function like:

    import time

    def test_foo():
        for i in range(10):
            yield foosleep, i

    def foosleep(n):
        time.sleep(4)

What is the expected output? What do you see instead?

Using --processes=10, all tests should complete in ~4s.
Instead, it takes 40s since the next test does not start until the last is finished.

What version of the product are you using? On what operating system?

1.0.0 on Ubuntu 10.04 x86-64

Please provide any additional information below.

I'm guessing we need a way to collect all tests before they are sent to the parallelizer?
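For reference, "collecting before parallelizing" amounts to expanding a generator test up front into independent (callable, args) work items that a dispatcher could hand to separate workers. A minimal sketch, outside of nose's actual API (the short sleep is just to keep it fast):

```python
import time

def foosleep(n):
    time.sleep(0.01)

def test_foo():
    for i in range(10):
        yield foosleep, i

# Expanding the generator yields ten independent work items; each one
# is self-contained (a callable plus its argument), so any worker
# process could run it.
work_items = list(test_foo())
assert len(work_items) == 10
for func, arg in work_items:
    func(arg)
```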

Google Code Info:
Issue #: 399
Author: [email protected]
Created On: 2011-02-19T09:04:15.000Z
Closed On: 2011-05-02T19:59:00.000Z

@ghost ghost assigned jpellerin Dec 14, 2011
@jpellerin (Member Author)

After playing around with the multiprocess.py file a bit, I got a hacked-up version to add tests from generators back to the queue. This was a bit tricky because the function args also had to be passed. I'm attaching the new multiprocess.py file and its diff; hopefully you guys have a better way of implementing this.

Google Code Info:
Author: [email protected]
Created On: 2011-02-20T16:06:50.000Z

@jpellerin (Member Author)

Google Code Info:
Author: [email protected]
Created On: 2011-02-20T16:07:31.000Z

@jpellerin (Member Author)

I forgot to mention that the biggest change that made this work is overriding the ContextSuite.run method with an implementation that queues up self._tests to testQueue when len(self._tests) > 1.
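The re-queuing idea can be sketched roughly like this (illustrative names only; the real plugin also handles pickling, test addresses, and result channels):

```python
import queue

# Shared queue that worker processes pull tests from (a stand-in for
# the plugin's testQueue).
testQueue = queue.Queue()

class SuiteSketch:
    """Toy stand-in for ContextSuite illustrating the override."""
    def __init__(self, tests):
        self._tests = list(tests)

    def run(self, results):
        if len(self._tests) > 1:
            # Push the individual tests back onto the shared queue so
            # any idle worker can pick them up, instead of running the
            # whole suite serially inside one worker.
            for test in self._tests:
                testQueue.put(test)
        else:
            for test in self._tests:
                results.append(test())

suite = SuiteSketch([lambda: 1, lambda: 2, lambda: 3])
suite.run([])
assert testQueue.qsize() == 3  # all three tests were re-queued
```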

rosen diankov,

Google Code Info:
Author: [email protected]
Created On: 2011-02-20T16:19:34.000Z

@jpellerin (Member Author)

this fixes several bugs when passing the results back to the main process.

rosen diankov,

Google Code Info:
Author: [email protected]
Created On: 2011-02-21T02:59:43.000Z

@jpellerin (Member Author)

this fixes bugs when the generator function throws an exception

rosen diankov,

Google Code Info:
Author: [email protected]
Created On: 2011-02-21T04:37:57.000Z

@jpellerin (Member Author)

It turns out that the timeout feature of the multiprocess plugin was also broken. My expectation was that a timeout of 1s would immediately stop any test that exceeded 1s. Instead, the current version of multiprocess just put timeouts on the queues and printed "timed out" messages, which is not very useful.

I'm attaching more patches that send SIGINT to each of the workers when a timeout occurs, so they gracefully stop their current test and return a failure.

Google Code Info:
Author: [email protected]
Created On: 2011-02-21T09:00:47.000Z

@jpellerin (Member Author)

one more small change: timeout failures now return a multiprocess.TimedOutException so they are easily identifiable

rosen diankov,

Google Code Info:
Author: [email protected]
Created On: 2011-02-21T09:17:49.000Z

@jpellerin (Member Author)

Instead of posting new versions, you can find the latest multiprocess.py with these fixes here:

https://openrave.svn.sourceforge.net/svnroot/openrave/trunk/test/noseplugins/multiprocess.py

rosen diankov,

Google Code Info:
Author: [email protected]
Created On: 2011-02-28T06:21:42.000Z

@jpellerin (Member Author)

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-03-20T18:00:00.000Z

@jpellerin (Member Author)

Hello. Thanks for following up with the patches. One minor comment: please avoid decorators like @staticmethod so that the code will compile without syntax errors on older Pythons (even though this plugin won't be used there). You can just double it up, like address = staticmethod(address).
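The doubled-up form mentioned above looks like this (a minimal illustration; `address` here is just an example name, not the plugin's real method):

```python
class Plugin:
    def address():
        # No 'self' parameter: this is meant to behave as a static helper.
        return 'localhost:0'
    # Equivalent to decorating with @staticmethod, but it still parses
    # on very old Pythons that lack decorator syntax.
    address = staticmethod(address)

assert Plugin.address() == 'localhost:0'
assert Plugin().address() == 'localhost:0'  # works on instances too
```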

Currently with this patch I get this failure in tox -e py26

ERROR: test_multiprocess.test_mp_process_args_pickleable

Traceback (most recent call last):
File "/Users/kumar/dev/nose/nose/case.py", line 187, in runTest
self.test(*self.arg)
File "/Users/kumar/dev/nose/unit_tests/test_multiprocess.py", line 54, in test_mp_process_args_pickleable
runner.run(test)
File "/Users/kumar/dev/nose/nose/plugins/multiprocess.py", line 298, in run
pickle.dumps(self.config)))
File "/Users/kumar/dev/nose/unit_tests/test_multiprocess.py", line 23, in __init__
self.pickled = pickle.dumps(pargs)
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/pickle.py", line 1366, in dumps
Pickler(file, protocol).dump(obj)
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/pickle.py", line 224, in dump
self.save(obj)
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/pickle.py", line 562, in save_tuple
save(element)
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/pickle.py", line 306, in save
rv = reduce(self.proto)
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/multiprocessing/sharedctypes.py", line 185, in __reduce__
assert_spawning(self)
File "/usr/local/Cellar/python2.6/2.6.5/lib/python2.6/multiprocessing/forking.py", line 25, in assert_spawning
' through inheritance' % type(self).__name__
RuntimeError: Synchronized objects should only be shared between processes through inheritance

Also, can you add a test case using the generator example above? If you get stuck, I can help with that.

To run the test suite, install tox and type tox -e py26 at the root of the nose source. http://codespeak.net/tox/

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-03-20T18:19:09.000Z

@jpellerin (Member Author)

this seems like a simple problem to solve; I'll patch it in a day or two and test with tox.

about staticmethod: I'll just make them regular methods that don't use self.

about the test case: I looked at the test cases for the current multiprocess plugin, and it will take me some time to figure things out and write a correct test case. Perhaps you could help?

Google Code Info:
Author: [email protected]
Created On: 2011-03-20T18:26:11.000Z

@jpellerin (Member Author)

hi kumar,

it doesn't look like you were using the latest code from:

https://openrave.svn.sourceforge.net/svnroot/openrave/trunk/test/noseplugins/multiprocess.py

Google Code Info:
Author: [email protected]
Created On: 2011-03-21T04:11:56.000Z

@jpellerin (Member Author)

Hi Rosen. I tried out this link but I got the same pickle test failure as above. That's the main thing to fix since I'm not entirely sure why that fails.

As for creating a regression test, I can try to make one so consider that low priority if you get stuck on it.

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-03-21T15:12:29.000Z

@jpellerin (Member Author)

adding empty comment to trick groups into resending previous messages which I accidentally moderated into the trash.

Google Code Info:
Author: [email protected]
Created On: 2011-03-21T15:16:30.000Z

@jpellerin (Member Author)

ok, this took me a while to figure out, but the problem was not in the new multiprocess.py plugin; it was in the test_multiprocess.py test. The original test ignored the first 4 parameters (the sync primitives) when pickling. The new plugin has more sync primitives, so the test has to ignore the first 7 parameters when pickling. It was also not popping tests off the testQueue, causing an infinite loop. I'm attaching the new test_multiprocess.py

You should get the latest multiprocess.py from the url above (doc updates).

Google Code Info:
Author: [email protected]
Created On: 2011-03-25T05:45:29.000Z

@jpellerin (Member Author)

Hi Rosen, thanks for digging into the tests. This patch is almost there. The python 2.6 tests are fixed but when I patched with the latest code from your svn link above I got a failure in python 3 that I don't understand. Can you take a look?

I cleaned up your patch for PEP8 and other minor things so please submit a diff against my fork here:
https://bitbucket.org/kumar303/nose-multi

Are you able to run tox -e py32?

(also same failure with py31)

http://pastebin.com/hGhJyVjj

Traceback (most recent call last):
  ...
  File "/Users/kumar/dev/nose/build/tests/nose/plugins/multiprocess.py", line 318, in run
    currentaddr = Array('c',' '*1000)
  File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/__init__.py", line 256, in Array
    return Array(typecode_or_type, size_or_initializer, **kwds)
  File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/sharedctypes.py", line 110, in Array
    obj = RawArray(typecode_or_type, size_or_initializer)
  File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/sharedctypes.py", line 87, in RawArray
    result.__init__(*size_or_initializer)
TypeError: one character string expected

I don't know why Array('c',' '*1000) works in 2.6 but not 3. Can you think of a workaround?

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-03-27T17:59:56.000Z

@jpellerin (Member Author)

that's weird, i just tried it on python 3.1 without problems:

{{{
[TOX] py31: commands succeeded
[TOX] congratulations :)
}}}

And Array does work:

{{{
$ python3.1
Python 3.1.2 (r312:79147, Sep 27 2010, 09:57:50)
[GCC 4.4.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.

import multiprocessing
multiprocessing.Array('c',' '*1000)
<SynchronizedString wrapper for <multiprocessing.sharedctypes.c_char_Array_1000 object at 0xd89560>>
}}}

Unfortunately my linux distro does not have a python 3.2 package so i can't test it...

Google Code Info:
Author: [email protected]
Created On: 2011-03-28T04:15:06.000Z

@jpellerin (Member Author)

oh, it does look like this problem is limited to Python 3.2, actually (my bad).

On Linux, you can just download the Python 3 source and type ./configure && make && make install; it will not clobber your existing Python, and you'll get a binary at /usr/local/bin/python3.2. You can pass a flag to configure to install it in a custom place for temporary use. For it to work in tox, all you have to do is make sure python3.2 is on $PATH.

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-03-28T16:00:45.000Z

@jpellerin (Member Author)

Sorry for the late response, I think i found a common ground. here are the patches

Google Code Info:
Author: [email protected]
Created On: 2011-04-01T08:28:41.000Z

@jpellerin (Member Author)

Hi Rosen, this is still failing and actually I don't see any differences in your patch. I had to apply it by hand though because you didn't diff against the branch I'm working in: https://bitbucket.org/kumar303/nose-multi (maybe I missed something?)

Can you clone that branch above, run tox -e py32, and take a look? I get the same thing:

ERROR: test_multiprocess.test_mp_process_args_pickleable

Traceback (most recent call last):
File "/Users/kumar/dev/nose-multi/build/tests/nose/case.py", line 188, in runTest
self.test(*self.arg)
File "/Users/kumar/dev/nose-multi/build/tests/unit_tests/test_multiprocess.py", line 58, in test_mp_process_args_pickleable
runner.run(test)
File "/Users/kumar/dev/nose-multi/build/tests/nose/plugins/multiprocess.py", line 318, in run
currentaddr = Array('c',' '*1000)
...
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/sharedctypes.py", line 87, in RawArray
result.__init__(*size_or_initializer)
TypeError: one character string expected

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-08T22:19:28.000Z

@jpellerin (Member Author)

i just noticed that the patch is in reverse! ;0)

the correct code is:
{{{
... Array('c',1000)
... value = b''
}}}
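For reference, the byte-oriented form works the same way on both Python 2.6+ and Python 3.x; a quick standalone check (outside nose):

```python
from multiprocessing import Array

# Size-only constructor: a 1000-byte shared char array, zero-filled.
currentaddr = Array('c', 1000)

# Assigning a bytes literal works on every supported Python version,
# unlike ' '*1000, which Python 3 rejects for the 'c' typecode.
currentaddr.value = b''
assert currentaddr.value == b''

currentaddr.value = b'some.test.address'  # hypothetical test address
assert currentaddr.value == b'some.test.address'
```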

Google Code Info:
Author: [email protected]
Created On: 2011-04-09T01:44:14.000Z

@jpellerin (Member Author)

You can apply reverse patches by passing the '-R' option to patch

Google Code Info:
Author: [email protected]
Created On: 2011-04-09T01:47:28.000Z

@jpellerin (Member Author)

After some tweaks to make multiprocessing use byte values in all versions of Python, the tests are finally passing.

Can you try installing Nose from the branch to see if it works in some real world multiprocessing suites?

pip install -e hg+https://bitbucket.org/kumar303/nose-multi#egg=nose

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-18T16:28:25.000Z

@jpellerin (Member Author)

If you made the exact changes i gave you, then it shouldn't be a problem. i've already tested them.

Using the command you gave gives an error:

{{{
rdiankov@rdiankov-laptop:~/python-nose-test$ pip install -e hg+https://bitbucket.org/kumar303/nose-multi#egg=nose
Checking out nose from hg+https://bitbucket.org/kumar303/nose-multi#egg=nose checkout from hg+https://bitbucket.org/kumar303/nose-multi#egg=nose
Cloning hg https://bitbucket.org/kumar303/nose-multi to ./src/nose
Running setup.py egg_info for package nose
no previously-included directories found matching 'doc/.build'
Exception:
Traceback (most recent call last):
File "/usr/lib/python2.6/dist-packages/pip.py", line 252, in main
self.run(options, args)
File "/usr/lib/python2.6/dist-packages/pip.py", line 408, in run
requirement_set.install_files(finder, force_root_egg_info=self.bundle)
File "/usr/lib/python2.6/dist-packages/pip.py", line 1786, in install_files
finder.add_dependency_links(req_to_install.dependency_links)
File "/usr/lib/python2.6/dist-packages/pip.py", line 1426, in dependency_links
return self.egg_info_lines('dependency_links.txt')
File "/usr/lib/python2.6/dist-packages/pip.py", line 1405, in egg_info_lines
data = self.egg_info_data(filename)
File "/usr/lib/python2.6/dist-packages/pip.py", line 1376, in egg_info_data
filename = self.egg_info_path(filename)
File "/usr/lib/python2.6/dist-packages/pip.py", line 1400, in egg_info_path
assert len(filenames) == 1, "Unexpected files/directories in %s: %s" % (base, ' '.join(filenames))
AssertionError: Unexpected files/directories in /home/rdiankov/python-nose-test/src/nose: /home/rdiankov/python-nose-test/src/nose/nose.egg-info /home/rdiankov/python-nose-test/src/nose/functional_tests/support/ep/Some_plugin.egg-info

Storing complete log in ./pip-log.txt
}}}

Google Code Info:
Author: [email protected]
Created On: 2011-04-19T01:22:11.000Z

@jpellerin (Member Author)

What version of pip do you have? Mine is 0.8.1 and I do not get that error using python 2.6. Judging from pip.py it looks like you might have a really old version.

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-19T17:13:12.000Z

@jpellerin (Member Author)

you were right, it was too old. just updated to pip 1.0 and tested with no problems. awesome work!

Google Code Info:
Author: [email protected]
Created On: 2011-04-19T17:30:35.000Z

@jpellerin (Member Author)

Excellent, thanks for testing. This has been merged into nose's main repo and will go out in 1.0.1

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-19T18:52:05.000Z

@jpellerin (Member Author)

Hmm, it seems that test_mp_process_args_pickleable() gets caught in an infinite loop, but not all of the time. Can you take a look? Try running it in a while loop so that you can catch when it gets stuck:

while true; do tox -e py32; done

After KeyboardInterrupt I see the following traceback. I added some debugging and it appears that the task is timing out without sending any results. I upped the timeout but it doesn't fix it.

Note that I am skipping this test for now so that our CI server doesn't explode.

Traceback (most recent call last):
File "selftest.py", line 59, in <module>
nose.run_exit()
File "/Users/kumar/dev/nose/build/tests/nose/core.py", line 118, in __init__
**extra_args)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/unittest/main.py", line 124, in __init__
self.runTests()
File "/Users/kumar/dev/nose/build/tests/nose/core.py", line 197, in runTests
result = self.testRunner.run(self.test)
File "/Users/kumar/dev/nose/build/tests/nose/core.py", line 61, in run
test(result)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 177, in __call__
return self.run(*arg, **kw)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 224, in run
test(orig)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/unittest/suite.py", line 62, in __call__
return self.run(*args, **kwds)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 75, in run
test(result)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 177, in __call__
return self.run(*arg, **kw)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 224, in run
test(orig)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 177, in __call__
return self.run(*arg, **kw)
File "/Users/kumar/dev/nose/build/tests/nose/suite.py", line 224, in run
test(orig)
File "/Users/kumar/dev/nose/build/tests/nose/case.py", line 46, in __call__
return self.run(*arg, **kwarg)
File "/Users/kumar/dev/nose/build/tests/nose/case.py", line 134, in run
self.runTest(result)
File "/Users/kumar/dev/nose/build/tests/nose/case.py", line 152, in runTest
test(result)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/unittest/case.py", line 494, in __call__
return self.run(*args, **kwds)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/unittest/case.py", line 442, in run
self._executeTestPart(testMethod, outcome, isTest=True)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/unittest/case.py", line 387, in _executeTestPart
function()
File "/Users/kumar/dev/nose/build/tests/nose/case.py", line 188, in runTest
self.test(*self.arg)
File "/Users/kumar/dev/nose/build/tests/unit_tests/test_multiprocess.py", line 58, in test_mp_process_args_pickleable
runner.run(test)
File "/Users/kumar/dev/nose/build/tests/nose/plugins/multiprocess.py", line 346, in run
timeout=nexttimeout)
File "/Library/Frameworks/Python.framework/Versions/3.2/lib/python3.2/multiprocessing/queues.py", line 129, in get
if not self._poll(block and (deadline-time.time()) or 0.0):
KeyboardInterrupt

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-19T20:35:20.000Z

@jpellerin (Member Author)

The patch is in fact much smaller than it looks. 95% of the change set is simply de-denting a large section of code. A whitespace-insensitive diff will show the important differences.

I'll take a quick look at adding a test right now, but if it takes more than 30 minutes, it may need to wait till the weekend, or have someone else do it.

Can you give any hints as to how specifically to add the testing?

While running the tests, I'm seeing bunches of issues I could very easily help out with.
What's your preferred method for me to submit changes?

Google Code Info:
Author: [email protected]
Created On: 2011-04-25T16:50:27.000Z

@jpellerin (Member Author)

If you get stuck you could also send me a small test suite that will reproduce the problem when run with multiprocess and I can convert it into a regression test.

As for contributing, the best way is to fork the repo on bitbucket https://bitbucket.org/kumar303/nose and commit small changes (in case we need to cherry pick). Thanks!

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-25T18:24:07.000Z

@jpellerin (Member Author)

I'm nearly done, but I'm having trouble capturing the bad KeyboardInterrupt behavior.

When run at the commandline, I see this:

../../../bin/nosetests --processes=2 --process-timeout=1 timeout.py
Process Process-1:
...
KeyboardInterrupt

E

ERROR: this test should fail when process-timeout=1

Traceback (most recent call last):
...
TimedOutException: 'timeout.test_timeout'


Ran 1 test in 1.255s

FAILED (errors=1)

But my doctest is not seeing the "Process-1" section via nose.plugins.plugintest.run_buffered(), so it is improperly passing.

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T00:04:14.000Z

@jpellerin (Member Author)

This is what I have so far. The one-line edit to multiprocess.py is needed to expose the improperly passing KeyboardInterrupt test.

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T00:11:47.000Z

@jpellerin (Member Author)

I've created a patch-queue here that I'm happy with.
It has full testing, even for the KeyboardInterrupt output.

https://bitbucket.org/bukzor/multiprocessing/qseries

Do you prefer that I present this as a "real" clone?

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T06:35:00.000Z

@jpellerin (Member Author)

The only cleanup that you may want to do is in the functional_tests/test_multiprocessing/mp_helper.py

That file is 99% the same as nose.plugins.plugintest, but I wasn't sure how you would merge the two. You might want to use the MPFile whenever you see that the multiprocess plugin is active?

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T06:38:34.000Z

@jpellerin (Member Author)

I've been doing some more testing and cleanup, but I've gotten stuck. How do I make my new multiprocessing tests be skipped when multiprocessing is not available?

I've tried to copy the method in functional_tests/doctest/test_multiprocess, but I wasn't successful. I don't understand how the _fixtures.py is hooked into the doctest.

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T19:01:50.000Z

@jpellerin (Member Author)

RTFM: I got it. I should have a fully tested patch set for all of these issues shortly.

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T19:07:37.000Z

@jpellerin (Member Author)

The patchset now passes tox on all versions of Python except Python 3.

I'm getting this strange failure. Any clues?

FAIL: Doctest: test_keyboardinterrupt.rst

Traceback (most recent call last):
File "/usr/lib/python3.1/doctest.py", line 2111, in runTest
raise self.failureException(self.format_failure(new.getvalue()))
AssertionError: Failed doctest test for test_keyboardinterrupt.rst
File "/home/bgolemon/trees/multiprocessing/build/tests/functional_tests/test_multiprocessing/test_keyboardinterrupt.rst", line 0


File "/home/bgolemon/trees/multiprocessing/build/tests/functional_tests/test_multiprocessing/test_keyboardinterrupt.rst", line 5, in test_keyboardinterrupt.rst
Failed example:
run_buffered( argv=[
'nosetests', '--processes=2', '--process-timeout=1',
join(dirname(__file__), 'support', 'timeout.py')
], plugins=[MultiProcess()])
Exception raised:
Traceback (most recent call last):
File "/home/bgolemon/trees/multiprocessing/build/tests/nose/plugins/multiprocess.py", line 350, in run
timeout=nexttimeout)
File "/usr/lib/python3.1/multiprocessing/queues.py", line 104, in get
raise Empty
queue.Empty

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.1/doctest.py", line 1246, in __run
    compileflags, 1), test.globs)
  File "<doctest test_keyboardinterrupt.rst[3]>", line 4, in <module>
    ], plugins=[MultiProcess()])
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/plugins/plugintest.py", line 391, in run_buffered
    run(*arg, **kw)
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/plugins/plugintest.py", line 380, in run

    run(*arg, **kw)
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/core.py", line 284, in run
    return TestProgram(*arg, **kw).success
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/core.py", line 118, in __init__
    **extra_args)
  File "/usr/lib/python3.1/unittest.py", line 1566, in __init__
    self.runTests()
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/core.py", line 197, in runTests
    result = self.testRunner.run(self.test)
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/plugins/multiprocess.py", line 408, in run
    worker_addr = bytes_(w.currentaddr.value,'ascii')
  File "/home/bgolemon/trees/multiprocessing/build/tests/nose/pyversion.py", line 125, in bytes_
    return bytes(s, encoding)
TypeError: encoding or errors without a string argument

Ran 337 tests in 7.715s

FAILED (SKIP=10, failures=1)

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T20:37:41.000Z

@jpellerin (Member Author)

Looks like bytes_() is not getting a str/unicode object. Maybe w.currentaddr.value is a None type or something? This might fix it:

bytes_(w.currentaddr.value or '', 'ascii')

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-26T21:22:58.000Z

@jpellerin (Member Author)

that doesn't make sense, you can see that currentaddr always gets set to

currentaddr.value = bytes_('')

Can you try printing currentaddr.value using log.info?

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T21:48:22.000Z

@jpellerin (Member Author)

That's exactly the problem. bytes() won't take bytes as an argument.

/home/bgolemon/trees/multiprocessing/.tox/py31/lib/python3.1/site-packages/nose-1.0.1.dev-py3.1.egg/nose/plugins/multiprocess.py(408)run()
-> worker_addr = bytes_(w.currentaddr.value,'ascii')
(Pdb) !w.currentaddr.value
b''
(Pdb) bytes_(b'')
*** TypeError: encoding or errors without a string argument

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T22:50:44.000Z

@jpellerin (Member Author)

I misspoke. The problem is in bytes_() not bytes().

See:

(Pdb) bytes_(b'')
*** TypeError: encoding or errors without a string argument
(Pdb) bytes(b'')
b''

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T22:54:25.000Z

@jpellerin (Member Author)

hmm...... looking at the definition of bytes_, it should be calling bytes on Python 3.x

{{{
if sys.version_info >= (3, 0):
    def bytes_(s, encoding='utf8'):
        return bytes(s, encoding)
else:
    def bytes_(s, encoding=None):
        return str(s)
}}}

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T22:57:01.000Z

@jpellerin (Member Author)

Shouldn't you really be using a string here, rather than bytes?

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T22:58:51.000Z

@jpellerin (Member Author)

ffx: bytes with no decoding works fine, bytes with an explicit encoding doesn't accept bytes (since it can't be decoded).

The same is true for int:

(Pdb) int(10)
10
(Pdb) int(10, base=8)
*** TypeError: int() can't convert non-string with explicit base

There's a discussion here:
http://mail.python.org/pipermail/python-dev/2010-June/100778.html

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T23:00:30.000Z

@jpellerin (Member Author)

it was originally all strings, but due to a Python 3.x quirk, the following does not work:

currentaddr = Array('c',1000)
currentaddr.value = 'does not work'

if you look at the beginning of the thread, you can read about this issue. The original solution was to use b'', but that got modified by kumar to bytes_.

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T23:04:17.000Z

@jpellerin (Member Author)

The only option I see is to alter bytes_() to return early if the input is bytes.

Kumar: does that seem ok, assuming it will pass tests?

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T23:22:32.000Z

@jpellerin (Member Author)

i guess you can always have the default encoding as None? would this work?

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T23:33:13.000Z

@jpellerin (Member Author)

No it's required to be a string, although that doesn't make sense to me personally.
Specifying None should logically be exactly the same as not specifying.

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T23:34:31.000Z

@jpellerin (Member Author)

Blerg!
Kumar please take a look at my patch set. I should work on other things now...

https://bitbucket.org/bukzor/multiprocessing

writing manifest file '/home/bgolemon/trees/multiprocessing/build/lib.linux-x86_64-3.1/nose.egg-info/SOURCES.txt'
[TOX] /home/bgolemon/trees/multiprocessing$ /home/bgolemon/trees/multiprocessing/.tox/py31/bin/python selftest.py
Traceback (most recent call last):
File "selftest.py", line 52, in <module>
"Incorrect usage of selftest.py; please see DEVELOPERS.txt")
AssertionError: Incorrect usage of selftest.py; please see DEVELOPERS.txt
________________________________ [tox summary] _________________________________
[TOX] ERROR: py31: commands failed

Google Code Info:
Author: [email protected]
Created On: 2011-04-26T23:38:14.000Z

@jpellerin (Member Author)

Hi Buck, thanks for all your work so far! I think this is the right thing to do:

{{{
if sys.version_info >= (3, 0):
    def bytes_(s, encoding='utf8'):
        if isinstance(s, bytes):
            return s
        return bytes(s, encoding)
else:
    def bytes_(s, encoding=None):
        return str(s)
}}}
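A quick sanity check of the proposed helper on Python 3 (standalone; the Python 2 branch is kept only for symmetry with the patch):

```python
import sys

# The proposed bytes_ helper: pass bytes through untouched, encode str.
if sys.version_info >= (3, 0):
    def bytes_(s, encoding='utf8'):
        if isinstance(s, bytes):
            return s
        return bytes(s, encoding)
else:
    def bytes_(s, encoding=None):
        return str(s)

assert bytes_(b'') == b''            # the case that used to raise TypeError
assert bytes_('abc') == b'abc'       # str still gets encoded
assert bytes_(b'abc', 'ascii') == b'abc'
```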

Also, the tests are passing in tox after the selftest.py change, which you can see in our CI machine:
http://hudson.testrun.org/job/nose-stable/17/

so I'm not sure how you got that error.

I'll have some time in the next couple days to review and apply your patch series.

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-04-27T15:32:44.000Z

@jpellerin (Member Author)

Great! Thanks.

If you check that change in, I won't have to add it to my patch series.
Should probably include a tiny test like:
eq_( b'foo', bytes_(b'foo', 'trash') )

There is a difficulty with my MultiProcessFile. While it does correctly capture all the output, even during multiprocessing, there are no guarantees on ordering, which is problematic for doctests. To get a deterministic ordering I'd need to cache all the processes' output separately, then do an ordered concatenation during read(). Does that sound like a good idea? The memory implications of caching all output worries me, but I guess that's no different from the original StringIO implementation.

This is interesting enough that I'll probably spend a few hours on it tonight after work. Let me know your opinion so I can steer development in the right direction.
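The per-process caching idea could look roughly like this (hypothetical names, not the actual MultiProcessFile; workeys are ordered by a worker id rather than arrival time):

```python
import io
from collections import defaultdict

class OrderedCaptureSketch:
    """Buffer each worker's output separately, then concatenate the
    buffers in a fixed (worker-id-sorted) order on read(), so doctests
    see deterministic output regardless of interleaving."""

    def __init__(self):
        self._buffers = defaultdict(io.StringIO)

    def write(self, worker_id, data):
        self._buffers[worker_id].write(data)

    def read(self):
        # Deterministic: ordering depends only on worker ids, not on
        # which process happened to flush first.
        return ''.join(self._buffers[wid].getvalue()
                       for wid in sorted(self._buffers))

cap = OrderedCaptureSketch()
cap.write(2, 'from worker 2\n')  # arrival order is nondeterministic...
cap.write(1, 'from worker 1\n')
# ...but the concatenated result is not.
assert cap.read() == 'from worker 1\nfrom worker 2\n'
```

As the comment above notes, this trades memory (all output is cached until read) for deterministic ordering, which is the same trade-off the original StringIO implementation already made.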

Google Code Info:
Author: [email protected]
Created On: 2011-04-27T16:33:48.000Z

@jpellerin (Member Author)

hi guys,

excellent work solving all the problems, it's impressive.

because the current xunit plugin will not work with multiprocess, i have attached some patches to it:

http://code.google.com/p/python-nose/issues/detail?can=2&q=267&colspec=ID%20Type%20Status%20Priority%20Stars%20Milestone%20Owner%20Summary&id=267

perhaps someone will be interested in checking it out...

Google Code Info:
Author: [email protected]
Created On: 2011-04-30T13:27:09.000Z

@jpellerin (Member Author)

https://bitbucket.org/bukzor/multiprocessing

This patchset is passing tox, except for what seems to be a newly-revealed bug.
More detail here:
https://groups.google.com/d/topic/nose-dev/BuJJOP7p4Vg/discussion

Google Code Info:
Author: [email protected]
Created On: 2011-04-30T23:19:24.000Z

@jpellerin (Member Author)

I've finally completed this work. You can find it here:
https://bitbucket.org/bukzor/nose-1.0.1

These changes pass tox on all platforms, and I've taken care to make each of the changes fairly small and comprehensible. Please consider merging.

I've attached the tox log, since I've spent a full week getting it to look this way :)

Google Code Info:
Author: [email protected]
Created On: 2011-05-01T03:44:02.000Z

@jpellerin (Member Author)

Rosen and Buck, thanks for all your work on this! I've merged the final patches into the stable branch.

Google Code Info:
Author: kumar.mcmillan
Created On: 2011-05-02T19:59:00.000Z

@jpellerin (Member Author)

Great! I'm going to delete that fork now.

Google Code Info:
Author: [email protected]
Created On: 2011-05-02T20:12:29.000Z

@cool-RR

cool-RR commented Oct 2, 2014

Hello,

It looks like I'm getting something like this problem now. When running nosetests with --processes=4 (or any other number) it simply ignores generator tests. What can I do?

@benmosher

I am also seeing my generated tests run sequentially, under Python 2.7 + nose 1.3.7.
