My Python for loop is causing a MemoryError. How can I optimize this?

抹茶落季 2021-01-19 06:45

I'm trying to compile a list of all the MAC addresses Apple devices can have. oui.txt tells me Apple has been assigned 77 MAC ranges to use. These ranges come in the form of 3-byte prefixes, so my loop over range(1, 1291845633) to enumerate every possible address raises a MemoryError.

5 Answers
  • 2021-01-19 07:17

    range(1, 1291845633) creates a list of 1,291,845,632 elements (tens of gigabytes on a 64-bit machine) all at once. Use xrange(1, 1291845633) instead and it will generate elements as you need them instead of all at once.
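    A quick illustration of the difference (Python 2; the exact byte count varies by build, but it stays small and constant):

    import sys

    lazy = xrange(1, 1291845633)    # O(1) memory: stores only start/stop/step
    print sys.getsizeof(lazy)       # a few dozen bytes, regardless of length

    # eager = range(1, 1291845633)  # would try to allocate the full list
                                    # and raise MemoryError on most machines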

    Regardless, it looks like you want something more like this:

    for mac in apple_mac_range:        # each 3-byte Apple prefix
        for i in xrange(16777216):     # all 2**24 possible device suffixes
            print mac, i
    

    Of course it's quite likely that a list of 1.3e+9 MAC addresses will not be very useful. If you want to see if a given MAC address is an Apple device, you should just check to see if the 3-byte prefix is in the list of 77. If you're trying to do access control by giving a router or something a list of all possible MAC addresses, it's unlikely that the device will accept 1.3e+9 items in its list.
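    If prefix-checking is all you need, here's a minimal sketch (Python 2, assuming the IEEE oui.txt format where assignment lines look like "00-11-22   (hex)   Apple, Inc."; the function names are made up for illustration):

    def apple_prefixes(path='oui.txt'):
        prefixes = set()
        with open(path) as f:
            for line in f:
                if '(hex)' in line and 'Apple' in line:
                    # the first field is the 3-byte prefix, e.g. "00-11-22"
                    prefixes.add(line.split()[0].replace('-', ':').upper())
        return prefixes

    APPLE = apple_prefixes()

    def is_apple(mac):
        # compare only the first 3 bytes ("AA:BB:CC" is 8 characters)
        return mac.upper().replace('-', ':')[:8] in APPLE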

  • 2021-01-19 07:29

    Don't use readlines(); it reads the whole file into memory at once. Iterate over the file object instead:

    with open('apple mac list') as f:   # open() is preferred over file()
        for x in f:                     # streams one line at a time
            print x                     # note: x keeps its trailing newline
    
  • 2021-01-19 07:32

    Others have answered your actual question, but I'm not really sure that's what's warranted here. Why not just create a class that implements __contains__ to test MAC addresses algorithmically? I presume you're getting a MAC and want to test whether it could be an iPhone MAC, so you could implement that class and then do something like:

    if found_mac in mac_tester:   # mac_tester is an instance of MACTester
      ...do work...
    

    Alternatively if you really do want an iterable sequence, you should at least use a generator instead of actually trying to fit them all in memory.
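    A minimal sketch of both ideas (Python 2; the class, helper names, and prefix values here are illustrative, not parsed from the real oui.txt):

    class MACTester(object):
        def __init__(self, prefixes):
            # prefixes: 3-byte OUI strings from oui.txt, e.g. '00:11:22'
            self.prefixes = set(p.upper() for p in prefixes)

        def __contains__(self, mac):
            # membership test looks at only the first 3 bytes
            return mac.upper().replace('-', ':')[:8] in self.prefixes

    def all_macs(prefixes):
        # generator alternative: yields one address at a time, O(1) memory
        for prefix in prefixes:
            for i in xrange(2 ** 24):
                yield '%s:%02X:%02X:%02X' % (
                    prefix, (i >> 16) & 0xFF, (i >> 8) & 0xFF, i & 0xFF)

    mac_tester = MACTester(['00:11:22'])        # placeholder prefix
    print '00:11:22:AA:BB:CC' in mac_tester     # True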

  • 2021-01-19 07:37

    How about:

    i = 0
    while i < 1291845633:   # counts without building a billion-element list
        print i
        i += 1
    
  • 2021-01-19 07:44

    Well, to start with, range(1, 1291845633) creates a list containing 1,291,845,632 entries. Since each entry is a full Python int object plus a pointer in the list, it's not too surprising that you run right out of memory. Don't do that.
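    A rough back-of-the-envelope (assuming 64-bit CPython 2: about 24 bytes per int object plus an 8-byte list slot; exact figures vary):

    n = 1291845632           # len(range(1, 1291845633))
    per_entry = 24 + 8       # ~int object + list pointer
    print n * per_entry / 2.0 ** 30, 'GiB'   # roughly 38 GiB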
