Task B: Product recommendation (inferred_conditions, 6 Marks)


Finally, we will implement a basic machine learning functionality: based on simple binary user feedback on individual products ("like"/"dislike"), our search engine should be able to automatically categorise other products as either interesting or uninteresting for the same user.

Technically, implement a function inferred_conditions(pos_ex, neg_ex) that accepts as input two product tables based on the same feature columns: one containing positive examples, i.e., products that the user likes, and the other containing negative examples, i.e., products that the user dislikes. Your function should return as output a list of conditions on the given numerical features that can be used in conjunction with the function selection from Part 1 to create a personalised recommendation for the user when applied to a table of new products. In particular, the returned conditions should be consistent with the provided list of positive examples (not deselect any of the products known to be liked by the user) but exclude as many of the known negative examples as possible.

In summary, the specification of the function inferred_conditions is as follows:

Input: a list of products pos_ex of positive product examples and a list of products neg_ex of negative product examples, both based on the same feature columns.

Output: a list of conditions conds on the numeric feature columns (i.e., those that do not contain strings) that follow the same specification as used in Part 1 of the assignment and that satisfy the following two criteria:

1. selection(pos_ex, conds) == pos_ex, i.e., the inferred conditions select all positive examples, and
2. len(selection(neg_ex, conds)) is minimal among all condition sets that satisfy the first criterion.

For instance, in our exemplary application to phones, we could imagine the following tables of positive and negative examples.

>>> pos_ex = [['iPhone 11', 'Apple', 6.1, 3110, 1280],
...           ['Nova 5T', 'Huawei', 6.26, 3750, 497],
...           ['V40 ThinQ', 'LG', 6.4, 3100, 800]]
>>> neg_ex = [['Galaxy S20', 'Samsung', 6.46, 3500, 1348],
...           ['V40 ThinQ', 'LG', 5.8, 3000, 598],
...           ['7T', 'OnePlus', 6.3, 3300, 1200]]

Another table of new phones could be as follows.

>>> new_phones = [['Galaxy S9', 'Samsung', 5.8, 3000, 728],
...               ['Galaxy Note 9', 'Samsung', 6.3, 3600, 700],
...               ['A9 2020', 'Oppo', 6.4, 4000, 355]]

See Figure 2 for an illustration of the content of these three tables mapped onto the two feature dimensions of “Screen size” and “Battery capacity”. Our function allows us to infer conditions that can be used with the function selection to recommend new phones.
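One possible way to infer such conditions is sketched below. This is a minimal sketch, not the official solution: it assumes that, as in Part 1, selection(table, conds) keeps exactly the rows that satisfy every condition, and that a condition can be represented as a triple (column, lower, upper) meaning lower <= row[column] <= upper; if your Part 1 uses a different condition representation, the same idea applies once the output format is adapted.

def inferred_conditions(pos_ex, neg_ex):
    # For every numeric feature column, take the tightest interval that still
    # contains all positive examples (criterion 1).  Under this interval
    # representation, any condition set that keeps all positives must permit
    # at least this interval on each column, so the tightest intervals also
    # admit the fewest negative examples (criterion 2); neg_ex is therefore
    # not needed to construct the result.
    conds = []
    for col in range(len(pos_ex[0])):
        values = [row[col] for row in pos_ex]
        if any(isinstance(v, str) for v in values):
            continue  # skip non-numeric columns such as model name and brand
        conds.append((col, min(values), max(values)))
    return conds

The result can then be combined with selection from Part 1, e.g. selection(new_phones, inferred_conditions(pos_ex, neg_ex)), to obtain the phones recommended to this user.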
