The input ID array.

Jhoncena
6 min read · Dec 9, 2020
And if you think about it, they're kind of right: AOP gives you a lot of power to add unrelated behavior into existing methods, or even replace their logic entirely. Of course, that might not be exactly why this pattern was introduced, and it's certainly not the intention of the example I provided above.


However, it does provide you with the ability to do whatever you want, and that, coupled with a lack of understanding of good programming practices, can cause a really big mess.

Re-use aspects across projects. You can think of aspects as components: small, decoupled pieces of code that can run anywhere. If you write your aspects correctly, you can share them across different projects with ease.
Next up is the attention mask. This mask is simply an array of 0s and 1s where each 1 represents a valid word/input ID, and a 0 represents padding. The encode_plus method outputs both input IDs and the attention mask tensors inside a dictionary:
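Here is a minimal sketch of what that call might look like, assuming the Hugging Face transformers tokenizer, a 50-token sequence length, and a placeholder model name and sentence:

```python
from transformers import BertTokenizer

# Assumption: bert-base-cased and "hello world" are placeholders for whichever
# model and text the rest of the article uses.
tokenizer = BertTokenizer.from_pretrained("bert-base-cased")

tokens = tokenizer.encode_plus(
    "hello world",
    max_length=50,              # pad/truncate everything to 50 tokens
    truncation=True,
    padding="max_length",
    add_special_tokens=True,    # adds [CLS] and [SEP]
    return_tensors="tf",
)

print(tokens["input_ids"])       # the token IDs, right-padded with zeros
print(tokens["attention_mask"])  # 1s for real tokens, 0s for the padding
```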
Next up are our classification layers. These will take the output from our BERT model and produce one of our three sentiment labels — there are a lot of ways to do this, but we will keep it simple:
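As a rough sketch of what those layers could look like (the pooling strategy and layer sizes are assumptions; a placeholder input stands in for BERT's last_hidden_state output of shape (50, 768)):

```python
import tensorflow as tf

# Placeholder standing in for BERT's last_hidden_state output (50 tokens x 768 dims).
embeddings = tf.keras.layers.Input(shape=(50, 768), name="bert_embeddings")

x = tf.keras.layers.GlobalMaxPool1D()(embeddings)   # (50, 768) -> (768,)
x = tf.keras.layers.BatchNormalization()(x)
x = tf.keras.layers.Dense(128, activation="relu")(x)
x = tf.keras.layers.Dropout(0.1)(x)
outputs = tf.keras.layers.Dense(3, activation="softmax", name="outputs")(x)  # three sentiment labels
```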
But before we move forward, notice the replaceMethod function; that's where all the magic happens. That's where the new function is created, and where we decide when to call our aspect and what to do with its returned value.
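The exact implementation isn't reproduced here, so this is only a hypothetical sketch of what such a replaceMethod function could look like; the signature and advice names are illustrative:

```javascript
// Hypothetical sketch: wrap the original method so the aspect runs at the
// requested moment and we decide what to do with its returned value.
function replaceMethod(target, methodName, aspect, advice) {
  const original = target[methodName];

  target[methodName] = function (...args) {
    let result;

    if (advice === "before") {
      aspect.apply(this, args);            // run the aspect first...
      result = original.apply(this, args); // ...then the original method
    } else if (advice === "after") {
      result = original.apply(this, args);
      // Let the aspect inspect (and potentially overwrite) the returned value.
      const overridden = aspect.call(this, result);
      result = overridden !== undefined ? overridden : result;
    } else if (advice === "around") {
      // Hand full control to the aspect, passing it the original function.
      result = aspect.call(this, original, args);
    }

    return result;
  };
}
```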
The flow of information through the BERT classifier model. We have two inputs, input_ids and attention_mask, which feed into BERT. BERT outputs two tensors — of which we use the last_hidden_state tensor and discard the pooler_output tensor.


Flexible logic. The logic around your implementation of the advice and pointcuts can give you a lot of flexibility when it comes to injecting your aspects. This in turn can help to dynamically turn on and off different aspects (pun definitely intended) of your logic.
To formalize the definition above a bit, let's take the logger example and cover three concepts of AOP that will help you out if you decide to look further into this paradigm:
Great way to encapsulate cross-cutting concerns. I'm a big fan of encapsulation, because it means you get code that is easier to read, easier to maintain, and re-usable all across your project.

With this explanation, you could argue that creating an AOP-based library to add logging logic to existing OOP-based business logic (for example) is relatively easy. All you'd have to do is replace the existing matching methods of the target object with a custom function that adds the aspect's logic at the right time and then calls the original method.

Nothing too fancy: a basic object with three methods. We want to inject two aspects that are generic to all of them, one to log the arguments received and one to analyze the returned value and log its type. Two aspects, two lines of code (instead of the six we would otherwise need), as shown in the sketch below.
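A hypothetical sketch of that scenario, assuming an inject helper along the lines of the one discussed in this article (the object, the aspects, and the inject signature are all illustrative):

```javascript
const calculator = {
  add(a, b) { return a + b; },
  subtract(a, b) { return a - b; },
  multiply(a, b) { return a * b; },
};

// Aspect 1: log the arguments every method receives.
const logArgs = (...args) => console.log("called with:", args);

// Aspect 2: inspect the returned value and log its type.
const logReturnType = (result) => console.log("returned a", typeof result);

// Two aspects, two lines of code; the "*" pointcut targets all methods.
inject(calculator, "*", "before", logArgs);
inject(calculator, "*", "after", logReturnType);
```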

The encode_plus output. We are given a dictionary with output encodings for a single sentence. Included are the input IDs (first) and the attention mask (second). Note that the attention mask tells us to focus on the first three tokens only, ignoring the remaining padding tokens.

And the best part of it all is that just like with OOP and FP in JavaScript, you can use a mixture of AOP with FP or OOP without breaking a sweat. So let’s first understand what this aspect deal is, and how useful it can really be for JavaScript developers.
But if you ask me, just because you can cause a lot of harm with this tool doesn't mean it's bad, because you can also do a lot of good with it (i.e., you can extract a lot of common logic into a centralized location and inject it wherever you need it, with a single line of code). To me, that is a powerful tool worth learning about and definitely worth using.
To give you a good example, imagine having written your business logic and then realizing that you have no logging code. The normal approach would be to centralize your logging logic inside a new module and then go function by function adding logging information.

For every BERT-based transformer model, we need two input layers that match our sequence length. We encoded our inputs to a length of 50 tokens — so we use an input shape of (50,) here:
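A minimal sketch of those two layers with Keras (the layer names are illustrative):

```python
import tensorflow as tf

# One input layer for the token IDs and one for the attention mask,
# both matching our 50-token sequence length.
input_ids = tf.keras.layers.Input(shape=(50,), name="input_ids", dtype="int32")
attention_mask = tf.keras.layers.Input(shape=(50,), name="attention_mask", dtype="int32")
```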
However, if you could grab that same logger and inject it into every method you’re looking to log, at very specific points during their execution with a single line of code, then this would definitely give you a lot of value. Wouldn’t you agree?
Because I'm a visual learner, I think showing a basic example of how you'd go about implementing a sort of inject method to add AOP-based behavior will go a long way.
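Here is a hypothetical sketch of such an inject method; it simply matches method names against the pointcut and delegates the actual wrapping to a replaceMethod helper like the one sketched earlier (the pointcut-matching rules are an assumption):

```javascript
// Walk the target object, find methods matching the pointcut, and wire in the aspect.
function inject(target, pointcut, advice, aspect) {
  Object.keys(target)
    .filter((key) => typeof target[key] === "function")
    .filter((key) => pointcut === "*" || new RegExp(pointcut).test(key))
    .forEach((methodName) => replaceMethod(target, methodName, aspect, advice));
}
```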

Pointcut (Where): Pointcuts reference the place in your target code where you want to inject the aspect. In theory, you could pinpoint anywhere in your target code where you want your code to be executed. In practice, that is not realistic, but you can specify things such as "all methods of my object", "only this particular method", or even something fancier like "all methods starting with get_".

The injected code is usually meant to address cross-cutting concerns, such as adding logging functionality or debugging metadata, though it doesn't have to be that generic; it can be anything that injects extra behavior without affecting the original code.

Advice (When): When do you want the aspect to run? Advice specifies the common moments at which you want your aspect's code to be executed, such as "before", "after", "around", "whenThrowing", and the like. In other words, it refers to a moment in time relative to the execution of the target code. For advice that runs after the code executes, the aspect will intercept the returned value and can potentially overwrite it if it needs to.

Aspects (What): These are the “aspects” or behavior you’re looking to inject into your target code. In our context (JavaScript), these will be functions that encapsulate the behavior you’re looking to add.
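To tie the three concepts back to the logger example, using the same hypothetical inject helper as above: the aspect is the logging function (what), the advice is "before" (when), and the pointcut is a pattern matching every method whose name starts with get_ (where). The object and names here are illustrative:

```javascript
const userStore = {
  get_user(id) { return { id, name: "Ada" }; },
  get_all() { return []; },
  save(user) { return true; },
};

// The aspect: a plain logging function.
const logCall = (...args) => console.log("about to run with:", args);

// Advice: "before"; pointcut: every method whose name starts with get_.
inject(userStore, "^get_", "before", logCall);  // save() is left untouched
```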

We have our encoded input IDs and attention masks, and the initialized BERT model. Now we need to add the layers required for feeding in the input ID and attention mask arrays, and the layers required for classifying BERT's output into sentiment ratings.
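Putting the pieces together, a sketch of the full assembly might look like this (the model name, pooling strategy, and layer sizes are assumptions, consistent with the snippets above):

```python
import tensorflow as tf
from transformers import TFAutoModel

bert = TFAutoModel.from_pretrained("bert-base-cased")

# Input layers matching the 50-token sequences.
input_ids = tf.keras.layers.Input(shape=(50,), name="input_ids", dtype="int32")
attention_mask = tf.keras.layers.Input(shape=(50,), name="attention_mask", dtype="int32")

# BERT returns two tensors; keep last_hidden_state ([0]) and discard pooler_output.
embeddings = bert(input_ids, attention_mask=attention_mask)[0]

# Classification head producing the three sentiment labels.
x = tf.keras.layers.GlobalMaxPool1D()(embeddings)
x = tf.keras.layers.BatchNormalization()(x)
x = tf.keras.layers.Dense(128, activation="relu")(x)
x = tf.keras.layers.Dropout(0.1)(x)
outputs = tf.keras.layers.Dense(3, activation="softmax", name="outputs")(x)

model = tf.keras.Model(inputs=[input_ids, attention_mask], outputs=outputs)
```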
