
Task B: Product recommendation (inferred_conditions, 6 Marks) Finally, we will implement a basic machine learning functionality: based on simple binary user feedback on individual products (“like”/“dislike”), our search engine should be able to automatically categorise other products as either interesting or uninteresting for the same user.

Technically, implement a function inferred_conditions(pos_ex, neg_ex) that accepts as input two product tables based on the same feature columns: one containing positive examples, i.e., products that the user likes, and the other containing negative examples, i.e., products that the user dislikes. Your function should return a list of conditions on the given numerical features that can be used in conjunction with the function selection from Part 1 to create a personalised recommendation for the user when applied to a table of new products. In particular, the returned conditions should be consistent with the provided list of positive examples (not deselect any of the products known to be liked by the user) but exclude as many of the known negative examples as possible.

In summary, the specification of the function inferred_conditions is as follows:

Input: a list of products pos_ex of positive product examples and a list of products neg_ex of negative product examples, both based on the same feature columns.

Output: a list of conditions conds on the numeric feature columns (i.e., those that do not contain strings) that follow the same specification as used in Part 1 of the assignment and that satisfy the following two criteria:

1. selection(pos_ex, conds) == pos_ex, i.e., the inferred conditions select all positive examples, and
2. len(selection(neg_ex, conds)) is minimal among all condition sets that satisfy the first criterion.

For instance, in our exemplary application to phones, we could imagine the following tables for positive and negative examples.
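As a sketch of one way to meet both criteria, consider bounding each numeric column by the minimum and maximum values seen in the positive examples. These bounds keep every positive example by construction, and any interval conditions that keep all positives must enclose them, so the tightest bounds select the fewest negatives among interval-style conditions. Note that the condition format below (one boolean function per column) and the behaviour of selection are assumptions about the Part 1 interface, not something specified here:

```python
def inferred_conditions(pos_ex, neg_ex):
    """Sketch: for each numeric column, bound values by the min/max
    seen in the positive examples; string columns are unconstrained.
    neg_ex is not needed to build the tightest bounds, only to check
    how many negative examples they still select."""
    conds = []
    for col in range(len(pos_ex[0])):
        values = [row[col] for row in pos_ex]
        if any(isinstance(v, str) for v in values):
            conds.append(lambda v: True)  # string column: no constraint
        else:
            lo, hi = min(values), max(values)
            # Default arguments freeze this column's bounds in the lambda.
            conds.append(lambda v, lo=lo, hi=hi: lo <= v <= hi)
    return conds

def selection(table, conds):
    """Part 1 style selection (assumed interface): keep the rows for
    which every per-column condition holds."""
    return [row for row in table
            if all(cond(value) for cond, value in zip(conds, row))]
```

With this sketch, criterion 1 holds automatically, since every positive example lies inside its own columns' min/max range.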

>>> pos_ex = [['iPhone11', 'Apple', 6.1, 3110, 1280],
...           ...
...           ['Nova 5T', 'Huawei', 6.26, 3750, 497]]
>>> neg_ex = [['Galaxy S20', 'Samsung', ...],
...           ...
...           ['V40 ThinQ', 'LG', ...],
...           ['7T', 'OnePlus', 6.3, 3300, 1200]]

Another table of new phones could be as follows.

>>> new_phones = [['Galaxy S9', 'Samsung', 5.8, 3000, 728],
...               ['Galaxy Note 9', 'Samsung', 6.3, 3600, 700],
...               ['A9 2020', 'Oppo', 6.4, 4000, 355]]

See Figure 2 for an illustration of the content of these three tables mapped onto the two feature dimensions of “Screen size” and “Battery capacity”. Our function allows us to infer conditions that can be used with the function selection to recommend new phones.
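To make the end-to-end use concrete, here is a hedged sketch applying hypothetical inferred conditions to new_phones. The numeric bounds below are illustrative only (the real bounds depend on the full example tables, which are partly elided above), and the condition format of one boolean function per column is an assumption about the Part 1 interface:

```python
# Hypothetical conditions of the kind inferred_conditions might return:
conds = [
    lambda v: True,               # name: no constraint
    lambda v: True,               # brand: no constraint
    lambda v: 6.1 <= v <= 6.4,    # screen size (inches)
    lambda v: 3110 <= v <= 3750,  # battery capacity (mAh)
    lambda v: 497 <= v <= 1280,   # price
]

def selection(table, conds):
    # Part 1 style selection (assumed): keep rows satisfying every condition.
    return [row for row in table
            if all(cond(value) for cond, value in zip(conds, row))]

new_phones = [['Galaxy S9', 'Samsung', 5.8, 3000, 728],
              ['Galaxy Note 9', 'Samsung', 6.3, 3600, 700],
              ['A9 2020', 'Oppo', 6.4, 4000, 355]]

print(selection(new_phones, conds))
# → [['Galaxy Note 9', 'Samsung', 6.3, 3600, 700]]
```

Under these illustrative bounds the Galaxy S9 is excluded by its screen size (5.8) and the A9 2020 by its battery capacity (4000), so only the Galaxy Note 9 would be recommended.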
