Attribute-based Broadcast Encryption (ABBE) uses an attribute-based encryption (ABE) scheme to implement broadcast encryption. In attribute-based encryption, a ciphertext is bundled with a data access policy (policy for short), which is a list of attributes connected by logic gates (typically AND and OR). Each user in the system possesses a set of attributes. To determine whether a user is a legitimate decryptor of the ciphertext, the system compares the user's attributes against the policy; if the policy is satisfied, the user can decrypt the ciphertext.
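To make the policy check concrete, here is a minimal sketch in plain Python (my own toy representation, not tied to any ABE library) of evaluating an AND/OR policy tree against a user's attribute set:

```python
# Toy policy evaluation: a policy is either an attribute name (leaf) or a
# tuple ("AND"/"OR", child, child, ...). This illustrates only the access
# check, not the actual cryptographic decryption.

def satisfies(policy, attributes):
    """Return True if the given attribute set satisfies the policy tree."""
    if isinstance(policy, str):              # leaf: a single attribute
        return policy in attributes
    gate, *children = policy                 # internal node: AND / OR gate
    results = [satisfies(child, attributes) for child in children]
    return all(results) if gate == "AND" else any(results)

policy = ("AND", "faculty", ("OR", "cs-dept", "ee-dept"))
print(satisfies(policy, {"faculty", "cs-dept"}))   # True  -> legitimate decryptor
print(satisfies(policy, {"student", "cs-dept"}))   # False -> cannot decrypt
```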
Compared with conventional broadcast encryption, attribute-based encryption effectively separates users into different groups, and access control is enforced by writing policies over those groups. If the number of attributes is much smaller than the total number of users in the system, ABBE has a big efficiency advantage. However, unlike conventional broadcast encryption, it cannot enforce ad-hoc access control on every individual user.
Hence, some proposed using each bit of the user ID as an attribute, which reduces the total number of attributes to log N (for N users). This is much smaller than the total number of users, and it sounds promising. However, during my study I found two major problems with this scheme when the number of legitimate users grows large.
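As a toy illustration of the idea (the attribute naming below is my own, not from any particular paper), a user ID can be expanded into log N bit-valued attributes like this:

```python
# Each of the log2(N) bit positions of a user ID becomes one attribute,
# e.g. "b2=1" or "b2=0", so 16 users need only 4 bit positions.

N_BITS = 4   # log2(N) for N = 16 users

def id_to_attributes(user_id, n_bits=N_BITS):
    """Turn a user ID into its set of bit-valued attributes."""
    return {f"b{i}={(user_id >> i) & 1}" for i in range(n_bits)}

print(sorted(id_to_attributes(5)))   # user 5 = 0101 -> ['b0=1', 'b1=0', 'b2=1', 'b3=0']
```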
First, in order to keep the ciphertext small, the scheme relies on Boolean algebra simplification. The idea is to list the IDs of all legitimate users in a truth table, then apply a simplification algorithm to obtain a shorter Boolean expression with the same result; finding the minimal expression is NP-hard. The two best-known algorithms for this task are the Quine-McCluskey algorithm and the Espresso heuristic logic minimizer. I have not pinned down their exact complexity, but according to my preliminary testing, the running time (for the Boolean simplification alone) can be much longer than running the encryption of Boneh's broadcast encryption. A better simplification algorithm, or a better implementation of one, is required.
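For a rough feel of this simplification step, here is a sketch using sympy's SOPform, which performs a Quine-McCluskey style minimization; the ID encoding and the example set of legitimate users are my own toy choices:

```python
# Minimize the OR of all legitimate user IDs into a short sum-of-products
# expression over the ID bits.

from sympy import symbols
from sympy.logic import SOPform

N_BITS = 4                                    # log2(N) with N = 16 users
b3, b2, b1, b0 = symbols("b3 b2 b1 b0")       # one Boolean variable per ID bit

def id_to_bits(user_id):
    """Bits of the ID, most significant first, in SOPform's minterm format."""
    return [(user_id >> i) & 1 for i in reversed(range(N_BITS))]

legitimate = [0, 1, 2, 3, 8, 9, 10, 11]       # the set of legitimate user IDs
minterms = [id_to_bits(u) for u in legitimate]

expr = SOPform([b3, b2, b1, b0], minterms)
print(expr)                                    # ~b2: every legitimate ID has bit 2 = 0
```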
Second, the number of sum-of-product terms output by the simplification is not constant. In other words, the ciphertext of this scheme has a variable size, which raises the question: how large can it get? I found it difficult to find or understand literature that discusses this problem, so I ran my own tests. I computed both the average case and the worst case for small N (since I have to generate all combinations of legitimate users, even a small N such as 32 is already very expensive). From my tests, the average size seems to grow as O(log N). The worst case, however, seems to be N/2 terms; for instance, if the legitimate set is exactly the IDs with an odd number of 1-bits, no two minterms can be merged and all N/2 terms remain. In my opinion, the real question is how often the size stays at O(log N). In a real application, I think a good average bound is not enough; a good worst-case bound is needed. Imagine N = 10M: N/2 is 5M, which results in an enormous ciphertext.
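The measurement I ran was along these lines (a sampled sketch with my own helper names; my actual test enumerated all combinations, which is why even N = 32 is already expensive):

```python
# A sampled version of the size measurement: for random sets of legitimate
# users, minimize the formula and count the product terms in the result.

import random
from sympy import symbols
from sympy.logic import SOPform
from sympy.logic.boolalg import Or

N_BITS = 5                                    # N = 32 users
N = 2 ** N_BITS
bits = list(symbols(f"b0:{N_BITS}"))          # variable order does not affect term counts

def id_to_bits(user_id):
    return [(user_id >> i) & 1 for i in range(N_BITS)]

def term_count(expr):
    """Number of product terms in a sum-of-products expression."""
    return len(expr.args) if isinstance(expr, Or) else 1

random.seed(0)
for size in (4, 8, 16, 24):
    counts = []
    for _ in range(20):                       # a few random legitimate sets per size
        users = random.sample(range(N), size)
        expr = SOPform(bits, [id_to_bits(u) for u in users])
        counts.append(term_count(expr))
    print(f"|S|={size}: avg terms={sum(counts)/len(counts):.1f}, max terms={max(counts)}")
```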